The future of Remote Sensing
Geoawesome has been among us for 10 years now. Time moves fast, so I thought it would be nice to have a look at the future of Remote Sensing. Where will we be with Remote Sensing in 10 years? You could argue that, at the moment, Remote Sensing is still in its infancy, or perhaps its adolescence. Wouldn't it be nice for Remote Sensing to reach maturity in the next 10 years?
The Magic of Remote Sensing
But what is a mature technology anyway? A mature technology is one that gets out of the way. As Arthur C. Clarke put it:
Any sufficiently advanced technology is indistinguishable from magic.
I still remember the feeling of magic that Dropbox evoked when I first used it in 2007 or 2008. The magic was that nothing changed. Your workflow was the same as before. But somehow, some magic technology synchronized all your data to the cloud. You didn't have to do anything or understand how file syncing or the cloud worked. Dropbox did that for you.
In the geospatial domain, the best example I could come up with is route planning. Everyone with a smartphone is able to plan routes from A to B without understanding GPS, spatial data formats, or routing algorithms. It just works.
What should happen in the field of Remote Sensing to reach the same level of adoption?
Remote Sensing is a communication problem
For Remote Sensing to be perceived as magic, it should move out of the way. It should disappear. What is still very much the case in the Remote Sensing field is that a lot of companies focus on the attributes of the technology instead of on the intricacies of the problem they should be solving.
Overselling is an issue, but so is over-perception. Clients hear what they want to hear. They want your technology to be the silver bullet. It's not. Remote Sensing companies should stop selling Remote Sensing and start selling solutions.
Step 1 in getting a technology adopted is to stop talking about it.
Filling in the gaps
Looking at satellite data providers, the first thing you'll notice is that each occupies its own niche. There's optical versus radar. High resolution versus Very High Resolution. C-band versus L-band versus X- and P-band. RGB versus 8-channel versus hyperspectral. Daily versus monthly revisit. Spaceborne versus airborne. And so on and so forth.
I believe that in the future there will be a constant stream of data from remote platforms, from your phone to satellites, in any spatial resolution and any wavelength you like. This is already happening with the rise of privately owned and operated satellite constellations.
What is needed are industry standards and best practices to store and offer these data in a structured way. Several Analysis Ready Data (ARD) efforts and projects like STAC & openEO are helping to shape this. Still, a lot of work needs to be done in this field.
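To give a flavor of what such standards buy you: with a STAC-compliant catalog, discovering imagery becomes the same few lines of code regardless of who operates the satellites. Here is a minimal sketch using the open-source pystac-client library; the endpoint and collection shown are just one public example, and the bounding box is illustrative.

```python
# Query a public STAC API for recent scenes; the same code works against
# any STAC-compliant catalog, which is the whole point of the standard.
from pystac_client import Client

# Earth Search is one public STAC endpoint (illustrative choice).
catalog = Client.open("https://earth-search.aws.element84.com/v1")

search = catalog.search(
    collections=["sentinel-2-l2a"],          # one example collection
    bbox=[4.0, 51.9, 4.6, 52.2],             # roughly the Rotterdam area
    datetime="2021-06-01/2021-06-30",        # one month of acquisitions
    max_items=5,
)

for item in search.items():
    print(item.id, item.properties.get("eo:cloud_cover"))
```

No provider-specific SDK, no bespoke file naming scheme: just a shared vocabulary for "what imagery exists, where, and when."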
Standards and analysis-ready data would have two major spin-offs:
- It allows for the commoditization of satellite data;
- It allows for adoption by non-Remote Sensing creative talent.
I would especially like to emphasize the second point. Call it the democratization of satellite data. If ARD is easily available and easy to use, it will find its way to a much bigger and more diverse pool of creative talent than the relatively small Remote Sensing field. That will make the number of applications and solutions explode. Some visual designers are already finding their way.
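How low can the barrier get? Cloud-native formats hint at it: a designer can pull a small piece of an analysis-ready scene straight over HTTP, with no downloads or preprocessing. A hypothetical sketch (the URL is a placeholder):

```python
# Open a Cloud-Optimized GeoTIFF directly over HTTP and read one small
# window; thanks to the COG layout, only the bytes for that window are
# actually fetched from the server.
import rasterio
from rasterio.windows import Window

url = "https://example.com/scenes/some-ard-scene.tif"  # illustrative URL

with rasterio.open(url) as src:
    chunk = src.read(1, window=Window(0, 0, 512, 512))  # 512x512 px crop
    print(chunk.shape, src.crs)
```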
Step 2 is to attract creative talent from outside the traditional Remote Sensing field by filling in the technical gaps and striving for Analysis Ready Data & standards.
Data quality in the post-truth era
Finally, and this is my most speculative point, we have to explore how we can create 'no questions asked' trust in Remote Sensing data. A lot of clients ask about data accuracy, precision, and quality. I agree this is an important point. But I often see clients requesting precision when they are actually looking for reliability & consistency. I feel there is a lot of bikeshedding around accuracy and precision.
There are, of course, ways to test accuracy and quality. But with the filling-the-gaps trend described above, there will be more and more data from a wide variety of platforms with a wide variety of quality. It’s too costly, not desirable, and not scalable if every solution provider has to set up its own accuracy testing program.
A way to approach this problem could be to take inspiration from how IT solved the authentication & identity problem. In old IT, identity was tied to a location (LAN, VPN). But with the advent of mobile devices and a plethora of cloud services, that model became untenable. So they came up with the 'zero trust' architecture, in which networked devices are not trusted by default. The device is responsible for proving its identity to the network, not the other way around. A similar idea could be applied to suppliers of Remote Sensing data. For this, we would need to solve the standards and ARD problem first, though.
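To make the analogy concrete, here is a minimal, purely hypothetical sketch of a 'zero trust' ingestion step: every incoming data product must prove its integrity before entering the pipeline, rather than being trusted because of who delivered it. It uses only the Python standard library with a shared secret for brevity; a real system would rely on public-key signatures and provider-published metadata.

```python
# Zero-trust-style ingestion sketch: the data product proves itself to the
# pipeline, not the other way around. All names here are illustrative.
import hashlib
import hmac

def verify_product(payload: bytes, claimed_digest: str, provider_key: bytes) -> bool:
    """Return True only if the product's signed digest checks out."""
    expected = hmac.new(provider_key, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, claimed_digest)

def ingest(payload: bytes, claimed_digest: str, provider_key: bytes) -> None:
    # Default stance: do not trust. Reject anything that cannot prove itself.
    if not verify_product(payload, claimed_digest, provider_key):
        raise ValueError("Product failed verification; rejected by default.")
    print("Product verified; handing off to analytics.")

if __name__ == "__main__":
    key = b"provider-issued-secret"           # illustrative shared secret
    scene = b"...binary satellite scene..."   # illustrative payload
    digest = hmac.new(key, scene, hashlib.sha256).hexdigest()
    ingest(scene, digest, key)
```

The point is not the cryptography itself but the default stance: in such a pipeline, data quality claims travel with the data and are checked automatically, so no solution provider has to run its own accuracy testing program from scratch.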
Step 3 is to explore how to ensure Remote Sensing data quality in analytics pipelines.
10 years ahead
Wrapping up, I think the future of Remote Sensing looks bright. Programs like Copernicus and Landsat, backed by public finance, ensure a long-term, consistent data backbone upon which an ecology of platform & solution providers can evolve. Privately owned satellite constellations are starting to fill in the gaps, allowing creative talent from non-related sectors to enter the Remote Sensing realm. Standards are starting to emerge, enabling Analysis Ready Data. Finally, a decent amount of work still has to be done on reliability and accuracy to make the adoption of satellite data unquestioned.
These trends provide a push towards the commoditization of satellite data. If Remote Sensing solution providers then also solve their communication problem, I see no reason why Remote Sensing solutions could not be adopted by a wide range of markets.