
Satellite-Derived Bathymetry: Mapping the Unseen Seafloor from Space

Mapping the global seabed is essential for understanding our planet and its climate. Yet with roughly 75% of the ocean floor still unmapped, vast areas of our oceans remain largely unexplored.

For everyday life, coastal zones are especially critical. To monitor and anticipate coastal change, we require accurate, large-scale measurements of key parameters such as bathymetry. However, nearly half of the world’s coastal waters remain unsurveyed today.

Image showing areas of the global seafloor considered to be mapped by the General Bathymetric Chart of the Oceans (GEBCO). Grey areas depict coverage as of 2021; red areas are additions for 2022. Credit: The Nippon Foundation-GEBCO Seabed 2030 Global Center (GDACC) on behalf of Seabed 2030; Source: seabed2030.org

Our vital coastal regions

Shallow coastal waters are home to about 10% of the global population and include many megacities. These low-lying areas are crucial for industrial activities such as maritime navigation, as well as for coastal protection and resource management. At the same time, they are highly dynamic and vulnerable to both natural and human-driven pressures, including erosion, extreme weather, sea-level rise, and demographic shifts. Continuous coastal monitoring is therefore indispensable.

Collecting bathymetric data in shallow waters has long been challenging. Conventional techniques like sonar surveys or airborne LiDAR are expensive, time-intensive, and logistically complex. As a result, data for many coastal regions is either outdated or entirely missing.

A promising way to close this gap is satellite-derived bathymetry (SDB), which leverages multispectral optical imagery to estimate water depth in nearshore areas.

How does it work?

Satellite data has been used to map the ocean floor for several decades, using techniques such as radar altimetry, wave kinematics, and even space-based lasers. These methods allow scientists to generate an approximate model of the ocean floor, but their resolution is not sufficient for shallow waters (down to a depth of 30 m).

Satellite-derived bathymetry uses optical imagery to estimate water depth at a given location. One of the most effective approaches is multispectral signal attenuation, which analyses imagery across a combination of spectral bands. Because different wavelengths penetrate the water to a greater or lesser degree, light attenuation can be measured, and band ratios are calibrated against the colour profile and spectral characteristics of coastal areas with known depths. Algorithms then infer water depth from new spectral information by comparing it to the known depths of similar areas. With this approach, it is possible to estimate water depth, even down to 30 meters, with a high level of accuracy.

Satellite-Derived Bathymetry - explanation

Source: USGS: https://www.usgs.gov/special-topics/coastal-national-elevation-database-%28coned%29-applications-project/science/satellite

The better the resolution and the more spectral bands used in the data, the more accurate the results will be, but even lower resolution data can provide valuable information.
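To make the band-ratio idea above concrete, here is a minimal Python sketch in the spirit of an empirical log-ratio depth model (such as Stumpf's). The band reflectances, calibration depths, and fitted coefficients are illustrative assumptions, not values from any real sensor or survey.

```python
# A minimal sketch of an empirical band-ratio depth model (log-ratio style).
# All reflectances and depths below are made-up calibration points.
import numpy as np

def log_ratio(blue, green, n=1000.0):
    """Ratio of log-scaled blue/green reflectance; it increases with depth."""
    return np.log(n * blue) / np.log(n * green)

# Calibrate a linear model against pixels with known (surveyed) depths.
known_blue  = np.array([0.032, 0.025, 0.018, 0.012])   # assumed reflectances
known_green = np.array([0.030, 0.021, 0.013, 0.007])
known_depth = np.array([2.0, 6.0, 12.0, 20.0])          # metres, assumed

m1, m0 = np.polyfit(log_ratio(known_blue, known_green), known_depth, deg=1)

# Apply the calibrated model to a new pixel to estimate its depth.
new_ratio = log_ratio(np.array([0.022]), np.array([0.016]))
print(m1 * new_ratio + m0)   # estimated depth in metres
```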

Today, data for SDB comes from various satellite sources, such as Landsat 8 (NASA), Sentinel-2 (ESA), and Pléiades and Pléiades Neo (Airbus). Pléiades Neo features a new spectral band called deep blue, with wavelengths of 400-450 nm, designed specifically for satellite-derived bathymetry and atmospheric corrections. This new band allows for much better interpretation of optical data over shallow waters than was previously possible.

What are the benefits of SDB?

SDB is a cost-effective and time-efficient alternative to traditional on-site methods. It doesn’t require ships or human divers, making it a safer method for data collection, and also dramatically reducing the environmental impact caused by more intrusive methods.

Airborne LiDAR bathymetry is another technique for measuring the seafloor from the skies. It uses an active sensor that emits green laser light, as opposed to the passive optical sensors carried by satellites. LiDAR offers higher accuracy than SDB, but at a much higher cost, as it requires a dedicated airborne survey and only produces data for a relatively small area. SDB, on the other hand, uses existing satellite infrastructure, its images cover a wider area, and data can be captured multiple times a day, giving it both repeatability and scalability. It is also time-efficient and can provide near-real-time monitoring, making it ideal for regular oceanographic management.

SDB is not without its disadvantages. For instance, it can lack spatial resolution when compared to other methods. Depending on the type of technology used, satellite imagery may be affected by weather conditions, water quality, and the presence of vegetation. It has limited effectiveness in deeper waters. Despite these limitations, SDB is a powerful tool for measuring shallow waters, and can be used very effectively in combination with other methods—above all because the data can be captured with such regularity.

Satellite-Derived Bathymetry - products

Source: https://business.esa.int/projects/international-satellite-derived-shallow-water-bathymetry-service

Applications of SDB

SDB in coastal regions can play a key role in a variety of practical applications that are essential to our current way of life and to the preservation of our future.

Bathymetry is an important tool for many industries. In terms of trade and logistics, accurate bathymetric data is essential for ships and port authorities to ensure safe navigation. Sea transportation is an essential component in over 90% of international trade, and the ships used to transport these goods are growing ever bigger. Understanding the shape of the ocean floor is a vital input for models of ocean currents and waves, and therefore bathymetry can help to improve sea-state monitoring, including wave prediction and more. For the telecommunications industry, bathymetry is used to plan the routes of subsea fiber-optic cables, avoiding coral reefs, shipwrecks, sensitive areas and geological obstructions. Around 474 of these cables are responsible for transmitting 99% of international data (according to TeleGeography, 2021).

Map of global subsea fiberoptic cables. Source: www.submarinecablemap.com

Energy companies also rely on bathymetry to explore and extract oil and gas, as well as using it to secure new energy resources like wind and ocean energy. It is also used to monitor sedimentary movements in the corridor between the coast and the energy exploitation site to ensure the integrity of cables connecting wind turbines to the shore. These cables are a critical part of the infrastructure for wind energy—which is the fastest growing blue economy sector, as of 2019.

Bathymetry is also used to estimate and control the volume of materials extracted from the seabed and monitor changes caused by dredging activities. Dredging is essential for many industrial and civil engineering applications, but can contribute to coastal erosion and environmental damage if not carefully managed. More broadly, bathymetry is used to better understand coastal erosion caused by both natural and human factors, and can aid in better management of our coasts, including the development of innovative coastal defense systems for risk mitigation and post-crisis analysis.

Accurate and precise knowledge of the seabed is also key in protecting undersea conservation sites and archaeological areas. One way that bathymetry data can help with this is by modelling underwater noise caused by human activity. Furthermore, the technology contributes to food security, helping to identify locations for offshore aquaculture cages, habitat mapping and impact modeling to meet the increasing demand for seafood.

Bathymetry even aids in issues of sovereignty: sovereign waters are defined by the limits of the continental shelf, as dictated by the United Nations Convention on the Law of the Sea (UNCLOS). Bathymetry helps to establish these boundaries and justify the rights of a coastal state to explore (and exploit) its maritime territory.

Satellite-Derived Bathymetry: An essential tool for the future

As climate change intensifies and coastal zones face increasing pressure, monitoring shallow waters is becoming more critical than ever. Satellite-derived bathymetry is emerging as a key instrument for hydrographers and coastal managers as the technology advances.

Its strengths—remote deployment, cost efficiency, safety, scalability, and minimal environmental impact—make SDB a compelling alternative to conventional surveys. Although accuracy and depth limitations remain, ongoing innovation promises substantial improvements.

The rise of affordable small satellites and expanded spectral capabilities—such as Pléiades Neo’s Deep Blue band—are accelerating SDB’s development. Future integration with computer vision and deep learning models is likely to further enhance performance.

With its ability to monitor coastal waters remotely, efficiently, and at scale, SDB is set to become a cornerstone technology for understanding and managing one of our planet’s most vital and vulnerable environments.



Quantum Geospatial: Beyond The Limits of Big Data

For years, geospatial systems have followed a simple idea: collect more data and decisions will improve. Cities now use sensors, phones, cameras, and satellites to build digital twins that simulate how urban systems behave. At small scales, this works well. But when we try to model entire cities, the system starts to break down.

The problem is no longer collecting data. It is processing it fast enough to make useful decisions.

Classical computers solve problems step by step. To optimize something like traffic, they evaluate one route, then another, then another. This works for a small area, but as cities grow, the number of possible paths increases extremely fast. Research on large urban systems shows how complexity increases as more components are modeled together. When moving from individual systems to entire cities, the number of interactions between transport, infrastructure, and environmental factors grows rapidly, making real-time analysis significantly harder for classical approaches.
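To see how quickly step-by-step evaluation breaks down, consider a toy calculation: visiting n stops in every possible order requires n! evaluations. The stop counts below are arbitrary examples.

```python
# Toy illustration of combinatorial growth in route evaluation:
# n stops can be visited in n! possible orders.
import math

for n_stops in (5, 10, 15, 20):
    print(n_stops, math.factorial(n_stops))
# 5 -> 120, 10 -> ~3.6 million, 20 -> ~2.4e18: far beyond one-path-at-a-time search
```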

Because of this, most systems rely on shortcuts. These heuristics find good enough answers, but not always the best ones. In traffic systems, this can create cascading problems. A route that looks efficient for one driver may create congestion when thousands follow it. Work on large-scale mobility systems highlights a key difficulty: optimizing routes for many vehicles at the same time without shifting congestion from one part of the network to another.

Another challenge is complexity. City systems depend on many factors at once, including weather, time of day, events, and infrastructure changes. Classical systems struggle to link these variables together. Research on high-dimensional geospatial modeling shows how quickly these interactions become difficult to compute. As the number of variables increases, the system must consider many more combinations at once, which significantly increases computational cost and makes real-time processing challenging.

Even when solutions exist, they are expensive. Large-scale optimization systems require significant computing power and energy. Advances like high-speed route optimization methods improve performance by up to two orders of magnitude, but they do not remove the underlying scaling problem. Similar challenges arise in broader discussions of geospatial big-data systems and route optimization at scale.

A simple way to understand this is to think of a maze. A classical computer tries to solve it by exploring one path at a time. That works when the maze is small. But for a city, there are too many paths to explore.

This article looks at what happens when we try a different approach. It explains how quantum computing changes the way problems are represented, where it may help in geospatial systems such as optimization and pattern detection, and why the future is likely to be a mix of classical and quantum methods rather than a full replacement.

A Different Approach: Quantum Computation

Quantum computing changes how problems are represented rather than just speeding them up. Instead of bits that are either 0 or 1, it uses qubits, which can exist in multiple states at once. This idea, called superposition, allows the system to encode many possible solutions simultaneously rather than checking them one by one. IBM’s overview of quantum computing describes how this shifts computation from step-by-step search to working with probability distributions.
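As a minimal illustration of superposition, the sketch below (assuming the Qiskit SDK is installed) puts a single qubit into an equal superposition with a Hadamard gate and inspects the resulting outcome probabilities on a local simulator.

```python
# Minimal superposition sketch using Qiskit (assumed installed).
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

qc = QuantumCircuit(1)
qc.h(0)  # Hadamard gate: |0> -> (|0> + |1>) / sqrt(2)

state = Statevector.from_instruction(qc)
print(state.probabilities())  # ~[0.5, 0.5]: both outcomes equally likely
```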


Source: Fermilab

This matters for geospatial problems because many of them involve exploring large solution spaces. Instead of evaluating each route or configuration separately, quantum systems reshape the search so that better solutions become more likely. In simple terms, it is less about trying every path and more about guiding the system toward good ones.

Two additional properties make this possible. Entanglement links variables together so changes in one part of a system affect another, which is useful for modeling connected urban systems. Interference helps amplify better solutions while reducing weaker ones. Comparisons of quantum vs classical computation explain how this differs from traditional trial-and-error approaches.
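A small follow-on sketch, again assuming Qiskit, shows entanglement: after a Hadamard and a CNOT, two qubits form a Bell state in which only the correlated outcomes 00 and 11 ever occur.

```python
# Minimal entanglement sketch using Qiskit (assumed installed).
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

bell = QuantumCircuit(2)
bell.h(0)        # superposition on the first qubit
bell.cx(0, 1)    # CNOT links the qubits: measuring one determines the other

print(Statevector.from_instruction(bell).probabilities())
# ~[0.5, 0.0, 0.0, 0.5]: only the correlated outcomes 00 and 11 appear
```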

These ideas are already being explored in real scenarios. Experiments show how quantum methods can be applied to routing problems, though still at a limited scale.

Quantum systems are still early and technically challenging. They require specialized hardware and are often accessed through cloud platforms rather than owned directly. But they introduce a different way of thinking about computation, one that is especially relevant for problems where the number of possibilities becomes too large for classical systems to handle efficiently.

Optimization at Scale

Optimization means finding the best way to do something, like routing vehicles through a city. This becomes difficult when thousands of vehicles are involved, because the number of possible routes grows very quickly.

Early experiments show how quantum methods can help. Work like Volkswagen’s traffic optimization project explored routing for large fleets, where each vehicle is assigned a different path instead of sending everyone through the same “fast” route. This helps reduce congestion rather than shifting it. Volkswagen’s Lisbon quantum shuttle experiment, a real-world trial, demonstrated this idea at a smaller scale, with buses dynamically routed based on city-wide traffic conditions.
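The formulation behind such experiments is typically a cost function over binary route choices, where assigning many vehicles to the same road is penalized. The toy sketch below brute-forces a two-vehicle, two-route version of that idea; the route costs and penalty weight are made-up numbers for illustration, not values from Volkswagen's model.

```python
# Toy congestion-aware route assignment: two vehicles, two candidate routes.
# Costs and the congestion penalty are illustrative assumptions.
import itertools

routes = ["A", "B"]
base_cost = {"A": 4, "B": 5}        # assumed travel times per route
congestion_penalty = 6              # assumed extra cost when vehicles share a route

best = None
for assignment in itertools.product(routes, repeat=2):   # enumerate all choices
    cost = sum(base_cost[r] for r in assignment)
    if assignment[0] == assignment[1]:
        cost += congestion_penalty
    if best is None or cost < best[1]:
        best = (assignment, cost)

print(best)  # (('A', 'B'), 9): spreading vehicles across routes wins
```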

Similar ideas are being explored in logistics. Studies on quantum approaches for logistics optimization and industry work like DHL’s technology-driven delivery systems show how routing, fuel use, and delivery efficiency can be improved by better optimization methods.

These systems are still early, but the direction is clear. Instead of optimizing one vehicle at a time, the system treats the entire city or network as a connected problem and finds more balanced solutions.

High-Dimensional Data and Pattern Detection

Modern satellites capture far more than standard images. Hyperspectral data includes hundreds of spectral bands, forming what are often called data cubes. These allow detection of subtle patterns such as vegetation health or water quality. Research on quantum machine learning for Earth observation data highlights how complex these datasets can become. Hyperspectral imagery can contain hundreds of spectral bands per pixel, creating high-dimensional data where patterns depend on correlations across many layers rather than a single image.
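A quick way to picture that scale, using assumed (not sensor-specific) dimensions:

```python
# Illustrative synthetic hyperspectral "data cube" with assumed dimensions.
import numpy as np

rows, cols, bands = 512, 512, 224              # assumed scene size and band count
cube = np.zeros((rows, cols, bands), dtype=np.float32)

print(cube.shape)              # (512, 512, 224)
print(cube.size)               # ~58.7 million values in one modest scene
print(cube[100, 200].shape)    # (224,) -> each pixel is a 224-dimensional spectrum
```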

The challenge is scale. Processing these high-dimensional datasets is computationally expensive, and classical systems often struggle to extract patterns efficiently. Work on quantum models for satellite imagery analysis explores how quantum approaches may handle these feature spaces differently.

One advantage is pattern detection. Instead of relying on large labeled datasets, some quantum approaches aim to identify structures with fewer examples. Quantum methods for hyperspectral analysis suggest that subtle signals, which may be missed by classical models, can be captured more effectively.

These methods are still experimental, but they point toward faster analysis of large geospatial datasets and improved detection of small or complex changes, especially in environmental monitoring scenarios.

The Current Reality: Hybrid Systems

Quantum computing is still in an early stage. Today’s systems, often described as noisy intermediate-scale quantum (NISQ) devices, are sensitive and not fully reliable on their own.

Because of this, most real use cases follow a hybrid approach. Classical systems handle data processing, storage, and user interaction, while only the hardest optimization tasks are sent to quantum hardware. For instance, the AWS Braket platform shows how both systems can work together rather than replacing each other.

This setup allows gradual adoption. PennyLane, a tool for hybrid quantum models, and frameworks such as Qiskit let developers experiment with quantum methods alongside classical systems. It also lowers the barrier to entry: cloud-based access means organizations can test quantum approaches without building their own hardware.
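As a minimal hybrid sketch, assuming PennyLane is installed, the example below runs a classical gradient-descent loop around a one-qubit circuit on a local simulator. The circuit, starting parameter, and step size are arbitrary choices for illustration, not a recommended workflow.

```python
# Minimal hybrid classical/quantum loop using PennyLane (assumed installed).
import pennylane as qml
from pennylane import numpy as np

dev = qml.device("default.qubit", wires=1)   # local simulator stands in for hardware

@qml.qnode(dev)
def circuit(theta):
    qml.RY(theta, wires=0)
    return qml.expval(qml.PauliZ(0))         # quantity the classical loop minimizes

opt = qml.GradientDescentOptimizer(stepsize=0.4)
theta = np.array(0.1, requires_grad=True)
for _ in range(50):                          # classical optimizer drives the circuit
    theta = opt.step(circuit, theta)

print(theta, circuit(theta))                 # theta approaches pi, expectation -> -1
```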

In practice, quantum computing is not replacing geospatial systems. It is becoming a supporting layer for problems that are difficult for classical methods alone.

Developing Practical Intuition

Quantum computing is not needed for every problem. For small tasks, classical systems are still faster, cheaper, and more reliable.

It becomes relevant when there is a clear computational bottleneck. This usually appears in problems with large search spaces, high-dimensional data, or strict time constraints. Discussions on quantum advantage in complex systems highlight where classical methods begin to struggle.

Typical signals include:

  • very large optimization problems (e.g., routing at city scale)
  • datasets with many layers (e.g., hyperspectral imagery)
  • situations where approximate solutions are not sufficient
  • strong interactions between many variables

In these cases, hybrid or quantum-inspired methods may offer an advantage. Insights from advanced quantum-enabled geospatial analysis show that these approaches are explored through hybrid workflows, focusing on optimization and high-dimensional problems. The key takeaway is that quantum methods are being used selectively, not as full replacements for classical systems.

A practical starting point is not full quantum adoption, but preparation. This includes structuring data, identifying bottlenecks, and experimenting with hybrid methods so integration becomes easier as the technology matures.


Geospatial systems are moving beyond collecting data toward understanding complex systems in real time. Classical methods took us far, but they struggle as scale and complexity increase.

Quantum approaches introduce a different way to handle these challenges. Not by replacing existing systems, but by extending what is possible in optimization and high-dimensional analysis.

The change is happening slowly, but it is important. It spans geospatial analysis such as routing through cities and remote-sensing detection of environmental change. The goal is not just to compute faster, but to make better decisions.

The future of geospatial intelligence will not be defined by a single technology. It will be shaped by how classical and quantum systems work together to understand and manage the world more effectively.

