Geoawesome · Blog and Community

The AI-Ready Spatial Stack:
How Wherobots and Felt Are Redefining GIS

An 11-question investigation into the Wherobots–Felt partnership: what problem it solves, how the architecture works, and where spatial intelligence is actually heading.

Topics: Cloud GIS · Modern GIS · GeoAI · Data Engineering · Collaborative Mapping
Ben Pruden
Wherobots · Head of Marketing
Jaime Sanchez
Felt · Director of Partnerships
AI Context Engine for the Physical World. Built by the original creators of Apache Sedona, the open standard for distributed geospatial computing.
Founded: 2022 · Sedona downloads: 62M+ · Series A: $21.4M
The "Google Docs of GIS." Browser-native collaborative mapping platform with AI-assisted analysis.
Founded: 2021 · Raised: $19.5M · Oakland, CA
When Wherobots and Felt announced their strategic partnership in February 2026, the geospatial community paid attention. Not because a partnership between two tech companies is unusual, but because this one targets a real, expensive, and frustrating problem: the gap between where spatial data lives (cloud lakehouses, at petabyte scale) and where it needs to go.

A map is one destination - but it's rarely the only one, and it's never the point on its own. A real estate analyst needs a full acquisition strategy. An insurance cat risk modeler needs a scoring algorithm that runs continuously across a live portfolio. An agronomist needs yield forecasts that update as conditions change across every field they manage. What all of these have in common is that the underlying data pipelines have to be production-grade and always current - because if the data feeding an application goes stale, the application becomes a toy. Too much geospatial work still ends in one-off "apps" that are outdated within days of delivery.

What makes this partnership distinctive is its focus on agent-native workflows: both systems enable users to orchestrate spatial data processing and visualization through AI coding assistants like Claude, Codex, and AWS Strands - not just through traditional UI - so that the agents consuming spatial data can produce multi-faceted, maintained outputs rather than static snapshots. This piece explores that gap through eleven questions, with architecture diagrams, performance benchmarks, code examples, and a hard look at what "AI-ready spatial data" actually means in practice.
Traditional GIS vs. Modern Spatial Stack
Traditional GIS Approach
  • Desktop software (e.g. ArcGIS, QGIS) with local file limits
  • Data exported as Shapefiles or CSVs, emailed or shared via FTP
  • No native connection to Databricks, Snowflake, or S3
  • CRS transformations handled manually per project
  • Scaling out means buying more desktop licences
  • AI/ML requires custom integrations, separate pipelines
  • Collaboration via screen-sharing or static PDFs
  • Raster processing bottlenecked by single-machine RAM
Modern Spatial Stack (Wherobots + Felt)
  • Fully browser-based; any device, no install required
  • Live queries direct from S3, Databricks Unity Catalog, Iceberg
  • Native lakehouse integration - data never leaves your cloud
  • CRS and format conversions handled automatically at query time
  • Serverless, pay-as-you-go distributed compute
  • AI inference (change detection, segmentation) built in via RasterFlow
  • Shareable maps via link - editable, commentable, versioned
  • GPU-, CPU-, and memory-optimised raster workloads at planetary scale
How the Integration Works
Data Layer
Amazon S3
Databricks Unity Catalog
Apache Iceberg / Parquet
GeoTIFF / Zarr / GeoJSON
↓   WherobotsDB connects directly - no data copy
Computation
Spatial AI Coding Assistant & MCP Server
WherobotsDB (Optimized Apache Spark + Sedona)
300+ vector & raster functions
RasterFlow (EO Inference)
GeoAI / PyTorch Models
↓   Spatial API + Wherobots API Keys → Felt Enterprise
Application
Felt Browser Map
AI-Assisted Analysis
Shareable Dashboards
Collaborative Editing
↓   Link-based sharing - viewable on any device
End Users
GIS Analysts
Business Teams
Field Operations
Policy & Decision Makers

↳ Data sovereignty preserved: compute runs in customer cloud; nothing migrated to third-party storage

11 Questions on the Modern Spatial Stack

The core issue is scale. Most legacy GIS workflows were designed when a "large" dataset meant a few hundred megabytes of vector features or a small set of raster tiles. Today, organizations are routinely working with agricultural telemetry at regional scale, satellite imagery updated daily, or building footprint datasets covering entire continents. Desktop tools hit a wall at a fraction of this volume, and that wall shows up as hours-long processing jobs, crashes on large joins, and analysts who've learned to pre-filter data just to make their software usable.

Beyond the raw size, the integration problem is equally serious. Spatial data in modern enterprises typically lives in Databricks, Snowflake, or S3 - not in GIS-specific file stores. Traditional tools don't connect to these systems natively. The result is a painful export-process cycle: data gets pulled from the lakehouse, converted to a shapefile or geodatabase, loaded into the GIS tool, analyzed, exported again, and pushed somewhere else. Every step introduces latency, potential errors, and a version control nightmare.

"Analyzing spatial data at scale has historically required specialized GIS software or custom applications - tools that were often cumbersome and difficult to connect with modern data systems." - Wherobots / Felt Partnership Announcement, February 2026 ↗

Finally, there's the AI and collaboration gap. Most modern data systems now come with AI assistants that write SQL or Python, and increasingly that is what users expect. Most legacy GIS tools can't support AI-native spatial operations, and they treat sharing as an export action: the output is a static PDF, a screenshot, or at best a packaged map file. There's no concept of a live, linked map that multiple people can view, annotate, and update simultaneously, powered by agent-native workflows.

All three - and that's what makes "AI-ready" a meaningful claim here rather than a slide-deck abstraction. The term refers to three distinct AI integration points in the stack.

Processing-level AI: Wherobots' RasterFlow runs custom or open PyTorch models directly on Earth observation data - change detection, land cover segmentation, object classification. The pipeline takes raw, unprepared imagery, builds inference-ready mosaics, runs the model, and delivers results as geometries in Apache Iceberg tables. Engineers don't need to set up GPU clusters or manage tiling pipelines; the infrastructure abstracts that away entirely. Output lands back in the customer's lakehouse, ready for downstream queries.

Agent-level AI: Wherobots recently launched a Spatial AI Coding Assistant for VSCode and an MCP (Model Context Protocol) server, letting LLMs execute complex spatial data science and engineering workflows through natural language. An agent can take a prompt like "find all parcels within 500 metres of a flooded river that changed land cover in the last 90 days" and translate it into distributed spatial SQL, Python queries, or RasterFlow operations - or build repeatable data pipelines and jobs without requiring manual query authoring. The intent is clear: Wherobots is positioning itself as essential infrastructure for AI systems that need to deterministically reason about and analyze the physical world.
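To make the agent pathway concrete, here is an illustrative sketch of how an MCP-style tool could expose distributed spatial queries to an LLM agent. The tool name, schema fields, and the dispatch stub are assumptions for illustration, not the actual Wherobots MCP server API.

```python
# Hypothetical MCP-style tool definition: the agent sees a typed tool it can
# call with Spatial SQL; the server routes the call to the compute layer.
import json

SPATIAL_SQL_TOOL = {
    "name": "run_spatial_sql",  # assumed tool name, not Wherobots' real one
    "description": (
        "Execute distributed Spatial SQL against the customer's lakehouse "
        "and return matching features."
    ),
    "inputSchema": {
        "type": "object",
        "properties": {
            "sql": {"type": "string", "description": "Spatial SQL query"},
            "limit": {"type": "integer", "default": 1000},
        },
        "required": ["sql"],
    },
}

def handle_tool_call(name: str, arguments: dict) -> str:
    """Dispatch an agent's tool call to a (stubbed) query backend."""
    if name != SPATIAL_SQL_TOOL["name"]:
        raise ValueError(f"unknown tool: {name}")
    # A real server would submit arguments["sql"] to the compute layer;
    # here we just echo a structured acknowledgement.
    return json.dumps({"status": "submitted", "sql": arguments["sql"]})

result = json.loads(handle_tool_call(
    "run_spatial_sql",
    {"sql": "SELECT parcel_id FROM parcels WHERE ST_Intersects(geometry, area)"},
))
print(result["status"])  # → submitted
```

The point of the pattern is that the agent never writes infrastructure code: it emits a tool call with a query string, and the server owns authentication, execution, and result formatting.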

Interaction-level AI: On the Felt side, AI-assisted map development lets users ask questions about their data in natural language - effectively "building a map." This goes well beyond a search bar; it's closer to a GIS analyst assistant embedded in the interface. Felt's AI layer can suggest relevant layers, interpret spatial patterns, and help non-technical users extract insight without needing to understand the query logic underneath.

"So far, AI has been really good with language, responding to us. The next frontier is spatial: understanding the physical world, not just the textual one." - Mo Sarwat, CEO, Wherobots ↗

The integration is API-driven and asynchronous - meaning Felt doesn't directly embed Wherobots' compute. Instead, Wherobots exposes a Spatial API and REST endpoints that Felt Enterprise can connect to using Wherobots API keys. When an analyst defines a layer in Felt that references a Wherobots data source, the query runs in the Wherobots cloud environment against the customer's own S3 or Databricks storage, and the resulting spatial features are streamed back to the Felt rendering layer. Wherobots also manages a native Iceberg catalog and can connect securely to any Iceberg catalog via service principals or tokens. On the near-term roadmap: native AWS Glue Catalog integration, a Snowflake Polaris catalog connector to follow, and a Wherobots-hosted Iceberg REST catalog that will let users expose their Wherobots Iceberg tables as REST endpoints consumable by any Iceberg-compatible service.

Spatial SQL · Example Wherobots Query
-- Processing agricultural parcel data with vegetation index
SELECT
  parcel_id,
  ST_AsGeoJSON(geometry) AS geom,
  RS_NormalizedDifference(b8_band, b4_band) AS ndvi,
  crop_type,
  farm_id
FROM wherobots.leaf_lake.parcel_sentinel2
WHERE ST_Intersects(geometry, ST_GeomFromText('POLYGON(...)'))
  AND acquisition_date = '2024-10-01'
-- Returns: GeoParquet → Felt layer in < 2s for continental coverage
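The NDVI column in the query above is the normalized difference of near-infrared (B8) and red (B4) reflectance. A pure-Python sketch of the per-pixel arithmetic (argument order assumed for illustration; the real `RS_NormalizedDifference` runs distributed over raster tiles):

```python
# Per-pixel normalized difference - NDVI when applied to Sentinel-2 B8 (NIR)
# and B4 (red). Conceptual illustration only, not Wherobots code.
def normalized_difference(nir: float, red: float) -> float:
    denom = nir + red
    if denom == 0:
        return 0.0  # convention chosen here for zero-reflectance pixels
    return (nir - red) / denom

# Healthy vegetation reflects strongly in NIR relative to red:
print(round(normalized_difference(0.45, 0.05), 2))  # 0.8
```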

The architectural elegance here is what the team calls "zero data movement." The geospatial data never leaves the customer's cloud environment. Wherobots processes the query in-place against data in Amazon S3 or Databricks Unity Catalog. Only the query results - the specific features the analyst needs - are transmitted to Felt's rendering engine. This matters for organizations with data sovereignty requirements: both Wherobots and Felt can be deployed within the customer's own VPC, meaning sensitive spatial data (parcel boundaries, infrastructure locations, movement data) remains entirely within the customer's controlled environment at every stage of the workflow.

On the Felt side, the Python SDK and REST API allow data engineers to programmatically define which layers are available to analysts, set refresh cadences, and configure access controls - all without requiring the business-side users to understand what's happening under the hood. In summary: engineers configure the connection once, analysts get a fast, interactive map.
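A minimal sketch of what that "configure once" step might look like. The field names (`source_type`, `refresh`, `access`) are hypothetical and chosen for illustration - they are not the actual Felt SDK or REST API schema:

```python
# Hypothetical layer-definition payload a data engineer might register once,
# after which analysts only ever see the finished layer in the Felt UI.
def define_layer(name: str, wherobots_table: str,
                 refresh_minutes: int, viewer_emails: list[str]) -> dict:
    return {
        "name": name,
        "source": {
            "source_type": "wherobots",   # assumed connector identifier
            "table": wherobots_table,
        },
        "refresh": {"interval_minutes": refresh_minutes},
        "access": [{"email": e, "role": "viewer"} for e in viewer_emails],
    }

layer = define_layer(
    "Parcel NDVI (weekly)",
    "wherobots.leaf_lake.parcel_sentinel2",
    refresh_minutes=60 * 24 * 7,        # weekly refresh cadence
    viewer_emails=["agronomy@example.com"],
)
print(layer["refresh"]["interval_minutes"])  # 10080
```

The design point is the separation of roles: the engineer owns the source reference and refresh cadence, while access control for consumers is declared alongside the layer rather than managed per export.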

The phrase "cloud-native GIS" has been stretched to cover a lot of different things - some legitimate, some marketing. The typical cloud GIS offering is essentially a legacy desktop platform wrapped in a web container, hosted on AWS or Azure. The underlying engine may still rely on single-server processing; it's just no longer on a machine under your desk. That distinction matters because the performance ceiling doesn't move much. A cloud-hosted single-node process still crashes or slows to a crawl when you try to spatial-join 400 million GPS pings with a national parcel dataset.

The architectural gaps are specific. First: distributed raster processing. Most cloud GIS platforms treat raster data - satellite imagery, elevation models, sensor grids - as a second-class citizen, applying vector-centric logic to fundamentally different data structures. Wherobots handles multi-modal workloads natively, with separate GPU-, CPU-, and memory-optimised compute paths depending on whether you're doing imagery classification (GPU), vector operations (CPU), or in-memory joins (RAM). Second: lakehouse connectivity. Established platforms typically require data to be imported into their own proprietary storage. The modern stack queries data where it already lives.

  • 5–20× faster than comparable cloud GIS workflows
  • 60% cost reduction vs. Databricks-native spatial ops
  • 300+ pre-built spatial SQL functions out of the box. Source: Wherobots ↗

Third, and often overlooked: the collaboration layer. Most GIS cloud platforms still model collaboration as "view access to a server-hosted map" - a portal with logins, not a live shared document. Felt's model is closer to Google Docs: the map is a shared artifact that multiple people can edit, comment on, and diverge from simultaneously. For teams that span engineering, operations, and business stakeholders, the difference is significant.

The stack handles raster, vector, and structured data - the full spatial trinity - and the benchmarks are not theoretical. On the vector side, the system routinely processes continental-scale building footprints, national road networks, and billions of GPS telemetry points. The Overture Maps Foundation - which is building an open reference map of the entire world - uses Apache Sedona and WherobotsDB as core processing components. Overture runs its buildings data pipeline through WherobotsDB every week, a pipeline that covers over 2.75 billion buildings worldwide and includes conflation across multiple sources. A benchmark that illustrated the real-world impact: converting a large-scale geospatial dataset from standard Parquet to GeoParquet cut query time from 1.5 hours to approximately 3 minutes.

For raster data, Wherobots' RasterFlow pipeline was purpose-built to handle Earth observation datasets that crash conventional tools: full-resolution Sentinel-2 mosaics, multi-temporal GeoTIFF stacks, NetCDF climate datasets. GPU-optimised workloads run model inference (change detection, land cover classification) directly on these datasets without first needing to tile, reproject, or otherwise pre-process them into something a desktop tool could handle. Supported formats include GeoTIFF, Zarr, NetCDF on the raster side, and GeoParquet, Shapefiles, GeoJSON, and Parquet with GEO types on the vector side.

"Wherobots is up to 10× faster than open-source Apache Sedona for geometry processing, and up to 20× faster for raster data processing." - Wherobots technical documentation ↗
"We accelerated the pipelines that produce the buildings dataset by up to 20x after we moved them to Wherobots, which required a simple redirection of our code. We retained compatibility with Apache Sedona, and the move put us into a development experience that's made us more productive." — Jennings Anderson, Geoscientist at Overture & Data Engineer at Meta, via Wherobots Series A announcement ↗

One important thing to keep in mind: LiDAR point clouds are the notable current limitation. The stack handles virtually every other spatial format at scale, but full-resolution point cloud processing (LAS/LAZ) isn't natively supported in the current architecture. Organizations working with drone LiDAR or airborne surveys will still need specialist tools for that specific data type.

The fundamental difference is that Felt treats a map as a living document rather than a snapshot. Traditional GIS sharing is essentially a publishing workflow: you complete the analysis, package the output, and distribute it - at which point it's static, frozen in time, and one-directional. Anyone who wants to interact with it needs either the original software, or a read-only portal. Neither option scales well to organizations where the people who need maps outnumber the people who can make them.

Felt's model is closer to how collaborative documents work in Figma or Notion. A map is a shared URL. Anyone with access can view it on any device, leave comments on specific geographic features, add their own layers, and see changes appear in real time. The originating team can configure who can view versus edit, but the default interaction model is participatory, not broadcast.

For Leaf Agriculture - the case study the partnership frequently highlights - the operational difference was concrete: instead of scheduling screen-sharing calls to walk stakeholders through new data products, the Leaf team now distributes a link. Field agronomists can view up-to-date vegetation indices on a phone. Decision-makers at the farm level can interact with parcel-level data without needing any GIS knowledge. The engineering team never needs to export or reformat data for downstream audiences; the Wherobots–Felt connection keeps the visualization synchronized with the processed data automatically.

One nuance worth noting: Felt's collaboration model requires an Enterprise plan to connect to Wherobots API keys. The collaboration features themselves are available on lower tiers, but the full Wherobots integration - the part that makes petabyte-scale processing accessible to business users - sits at the enterprise pricing level, which shapes where in the market this stack realistically lands.

Data governance in the lakehouse model is handled at the infrastructure layer, not bolted on as an afterthought. When Wherobots connects to Databricks Unity Catalog, it uses the customer's own Databricks service principal and OAuth or PAT token authentication. That means Wherobots inherits whatever access control policies already exist in Unity Catalog - column-level permissions, row-level filtering, table-level access grants - without requiring any duplication of governance logic in a separate GIS platform. The spatial query runs with exactly the same permissions as any other Databricks workload from that credential.
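As a sketch of what inherited governance means in configuration terms, the connection settings reduce to the customer's own workspace and service principal credentials - no separate permission model. Exact parameter names below are assumptions for illustration, not Wherobots' real configuration schema:

```python
# Hypothetical Unity Catalog connection settings: OAuth machine-to-machine
# credentials for a Databricks service principal. Note there are no
# storage or permission settings - queries run with whatever grants the
# principal already holds in Unity Catalog.
import os

def unity_catalog_connection() -> dict:
    return {
        "type": "unity_catalog",
        "workspace_url": os.environ.get(
            "DATABRICKS_HOST", "https://example.cloud.databricks.com"),
        "auth": {
            "client_id": os.environ.get(
                "DATABRICKS_CLIENT_ID", "<service-principal-id>"),
            "client_secret": os.environ.get(
                "DATABRICKS_CLIENT_SECRET", "<secret>"),
        },
    }

conn = unity_catalog_connection()
print(conn["type"])  # unity_catalog
```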

For Amazon S3 connections, data stays in the customer's S3 bucket. Wherobots processes queries in-place using IAM-based authentication. This is the "zero data movement" guarantee that the partnership emphasizes for regulated industries: healthcare spatial data, government parcel records, financial location data - none of it passes through external storage at any point in the pipeline.

Versioning is handled through open table formats. Wherobots drove GEO type support for Apache Iceberg and Apache Parquet, which means spatial datasets stored in Iceberg tables benefit from Iceberg's native time-travel capabilities - the ability to query a dataset as it existed at any previous point in time. For teams managing frequently updated datasets (monthly satellite composites, quarterly parcel updates, daily IoT feeds), this provides an audit trail without custom version management tooling.
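Iceberg's time-travel semantics can be illustrated with a simplified model: each table commit is a snapshot, and an "as of" query resolves to the latest snapshot at or before the requested timestamp. This is a conceptual sketch, not the Iceberg implementation:

```python
# Simplified model of Iceberg time travel: a time-travel query reads the
# most recent snapshot committed at or before the requested timestamp.
from datetime import datetime

snapshots = [  # (commit time, snapshot id), in commit order
    (datetime(2025, 7, 1), "snap-001"),   # e.g. a quarterly parcel update
    (datetime(2025, 10, 1), "snap-002"),  # e.g. a monthly satellite composite
    (datetime(2026, 1, 1), "snap-003"),   # e.g. a scheduled refresh
]

def snapshot_as_of(ts: datetime) -> str:
    """Return the snapshot id a time-travel query at `ts` would read."""
    eligible = [sid for commit_time, sid in snapshots if commit_time <= ts]
    if not eligible:
        raise ValueError("no snapshot exists at or before this timestamp")
    return eligible[-1]

print(snapshot_as_of(datetime(2025, 11, 15)))  # snap-002
```

This is what gives teams an audit trail for free: reproducing last quarter's analysis is a matter of pinning the query to last quarter's snapshot, not restoring a backup.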

Wherobots' architecture received SOC 2 attestation, and their documentation emphasizes the security-first design as a prerequisite for supporting enterprise workloads. The combination of lakehouse-native auth, zero data movement, and open-standards table formats is a more defensible governance posture than most GIS-specific platforms, which typically maintain their own proprietary permission systems.

The intended division is clean in theory: data engineers use Wherobots to process and curate spatial data products - running joins, building derived datasets, orchestrating inference pipelines - and then publish those products as queryable layers accessible in Felt. Business analysts, planners, and operational teams interact with those layers through Felt's visual interface without needing to understand the underlying query logic or infrastructure. The model mirrors how a well-run analytics platform separates the "data producers" from the "data consumers."

In practice, friction still exists at a few specific boundaries. The first is layer definition complexity. An analyst who wants to customize what data is shown - apply a different filter, change the spatial aggregation level, combine two data sources - has traditionally needed to involve the data engineering team or write Spatial SQL themselves. Felt's AI assistant is actively closing this gap: the natural language interface handles common query patterns well, and the team is steadily expanding the range of spatial questions it can interpret without manual SQL authoring.

Felt SQL · Business density analysis by county - spatial join example
WITH business_in_counties AS (
  SELECT
    counties.name                                          AS county_name,
    counties.geometry                                      AS county_geom,
    COUNT(businesses.business_id)                         AS total_businesses,
    AVG(businesses.revenue)                               AS avg_revenue,
    MODE() WITHIN GROUP (ORDER BY businesses.business_type) AS most_common_business_type
  FROM california_county_geoms AS counties
  JOIN california_businesses AS businesses
    ON ST_Contains(counties.geometry,
         ST_SetSRID(ST_MakePoint(businesses.longitude, businesses.latitude), 4326))
  GROUP BY counties.name, counties.geometry
)
SELECT
  county_name,
  county_geom,
  total_businesses,
  ROUND(avg_revenue, 2)  AS avg_revenue_dollars,
  most_common_business_type
FROM business_in_counties
ORDER BY total_businesses DESC;
-- Runs directly in Felt SQL → returns a styled choropleth-ready layer

The second friction point is debugging. When a Felt layer returns unexpected results – missing features, incorrect geometries, stale data – tracing the issue back through the Wherobots query layer to the original data source requires technical knowledge that most analysts don't have. Observability tooling is improving, but the debugging experience across the two platforms isn't yet seamless.

The third is the distinction between connecting the platforms and setting up Wherobots itself. Connecting Wherobots to Felt is straightforward – once an API key is in place, layers flow through with minimal configuration on the Felt side. The steeper technical investment is in Wherobots: managing authentication against Unity Catalog or S3 and structuring data pipelines for production use. Infrastructure itself is fully managed – teams select a runtime size (tiny, small, medium, large, and so on) and Wherobots handles everything underneath, with guidance available on sizing for different workload types. That's still a data engineering skill set in terms of pipeline design, but the operational overhead is substantially lower than self-managed alternatives. Looking ahead, a native jobs interface – agentically constructable – is planned for later this year, which will allow end-to-end job orchestration within Wherobots even across connected infrastructure. Wherobots' Community Edition lowers the barrier for exploration, but teams without data engineering capacity should factor pipeline design into their adoption planning.

Leaf Agriculture is the most detailed public case study and it's instructive. Leaf provides a unified API for agriculture organizations working with telemetry data from tractors and field sensors - a domain that generates enormous volumes of spatial data (GPS tracks, sensor readings, parcel boundaries) that need to be processed and visualized for customers at scale. Their previous workflow involved significant manual effort to transform raw data into anything an end customer could actually use.

After adopting the stack, Leaf reports improvements across three dimensions: cost efficiency, scale, and the ability to innovate without proportionally growing the team. The most operationally striking change: instead of in-person screen-sharing sessions to review new data products with farm customers, the Leaf team now distributes links to Felt maps that are viewable from any device. That's not just a convenience - it eliminates a coordination overhead that was bottlenecking their product delivery cadence.

"Felt saves us hours on every analysis and has enabled us to build faster without increasing team size." - Leaf Agriculture team, via Felt partnership page ↗
"With Wherobots on AWS, not only can we easily scale to millions of acres and continuous tractor telemetry normalization within LeafLake, but also we can rest assured that our costs won't spiral out of control." — G. Bailey Stockdale, CEO, Leaf Agriculture, via WherobotsDB price performance post ↗

LeafLake - Leaf's recently announced data product built on the Wherobots stack - is a direct output of this architecture. It's a spatial data lake product that Leaf can now offer to agriculture customers, enabled by Wherobots' ability to process massive datasets of agricultural, parcel, and tractor telemetry at scale. That product likely wouldn't have been feasible on a legacy GIS architecture within Leaf's team size or budget constraints.

More broadly, organizations adopting the stack tend to report similar patterns: a reduction in the time between "data is available" and "stakeholders can see it," a reduction in the engineering effort required to serve spatial insights to non-technical users, and an increase in the number of spatial questions that can realistically be answered without a multi-day analysis project.

For many organizations, Wherobots and Felt complement existing GIS investments - particularly for teams working with large-scale cloud data, real-time pipelines, or modern collaboration workflows that weren't the focus when legacy platforms were designed. That said, some organizations have chosen to consolidate onto this stack entirely, especially where spatial data lives in Databricks or S3 and distributed query performance matters. The right answer depends on your team's specific workflows and what you're optimizing for.

The stack adds the most value where cloud-native performance and collaboration are the priority: large-scale raster processing, real-time data pipelines, sharing spatial insights with non-GIS audiences, and embedding mapping into broader data engineering workflows. These are areas where the Wherobots–Felt combination addresses needs that weren't central to how earlier platforms were architected.

That said, the migration path is not uniform. Organizations doing precision cartographic work to regulatory standards, or operating within government procurement frameworks that mandate specific certified toolchains, will move more slowly - not because the stack can't serve those needs, but because institutional change cycles are long. For those teams, a hybrid approach - running the modern stack for high-volume analytics while maintaining legacy tools for specific certified outputs - is a realistic interim state, not a permanent architecture.

The question isn't which platform is better in the abstract - it's which combination of tools best serves your team's workflows today and where you want to be in three years.

This question gets at something important that the technology conversation often leaves out: awareness and translation. Wherobots and Felt are building technically sophisticated systems, but the people who most need to understand what those systems enable are often not the ones reading Apache Sedona release notes or AWS Marketplace listings. They're urban planners, agriculture ministers, climate adaptation officers, infrastructure directors - people who make decisions that are fundamentally spatial but who don't think of themselves as "geospatial professionals."

Platforms like Geoawesome occupy a specific and valuable position in this ecosystem: technically credible enough to be trusted by practitioners, but accessible enough to reach the decision-maker and policy audience. That translation work is a significant challenge. Explaining why distributed spatial SQL matters to a municipal planning director requires framing the problem in terms they care about - not query latency, but how quickly their team can assess flood risk across 200,000 parcels after a storm event. That framing work is where editorial platforms add structural value that product companies can't replicate through their own marketing.

There's also a specific role in connecting the stack to education, research institutions and universities ↗. The Wherobots–Felt architecture is well-suited to academic geospatial research - planetary-scale datasets, reproducible workflows, open standards - but university GIS programs are still largely teaching ArcGIS. Coverage that helps researchers and educators understand what the modern stack makes possible accelerates the pipeline of practitioners who are fluent in it. Apache Sedona's download numbers suggest the developer community is already there; the GIS professional community is catching up more slowly, and media coverage is part of how that gap closes.

"Maps are no longer optional; they're critical tools for rapidly understanding our place on earth. But building and scaling them has traditionally required heavy engineering effort." - Mo Sarwat, CEO, Wherobots blog ↗

On a broader scale, the spatial intelligence sector is undergoing a period of significant change and potential. The convergence of better cloud infrastructure, open data formats, AI inference capabilities, and collaborative interfaces is creating genuinely new possibilities - things that weren't feasible five years ago. Platforms that can communicate that shift clearly, and connect it to real-world applications that policy audiences care about (climate resilience, food security, urban growth, infrastructure monitoring), help ensure that investment, regulation, and institutional adoption keep pace with what the technology can actually do.

Key Terms Explained
Spatial SQL
An extension of standard SQL that adds geometric data types (points, lines, polygons, rasters) and spatial operations (ST_Intersects, ST_Distance, ST_Union). Allows querying geographic relationships using the same query language data engineers already use for tabular analytics.
Apache Sedona
Open-source cluster computing framework that adds first-class geospatial support to Apache Spark and Apache Flink. Created at Arizona State University in 2017 ↗, now part of the Apache Software Foundation. Wherobots is the managed cloud service built on Sedona.
Agent-Native Workflows
Systems designed to be orchestrated, queried, and controlled by LLM-powered agents and coding assistants, rather than only by human UI interaction. Wherobots' MCP server and Felt's API-first design enable agent-native spatial analysis—agents can autonomously execute spatial queries, build pipelines, and generate visualizations through natural language.
Lakehouse
Architecture that combines the scalable storage of a data lake (e.g. Amazon S3) with the query performance and governance features of a data warehouse. Platforms like Databricks and Apache Iceberg implement this model. Wherobots connects directly to lakehouse storage without requiring data migration.
AI Inference
Running a trained machine learning model against geospatial data - typically satellite or aerial imagery - to extract structured information. Common tasks: change detection (has this area changed between two dates?), land cover classification, object detection. Wherobots RasterFlow operationalizes this at continental scale.
GeoParquet
An extension of the Apache Parquet columnar file format that adds native support for geospatial data types and spatial indexing. Enables cloud-optimised storage and fast spatial queries without specialist GIS file formats. Wherobots has driven GEO type adoption in both Parquet and Apache Iceberg.
CRS (Coordinate Reference System)
The mathematical framework that maps coordinates to actual locations on Earth's curved surface. Data from different sources often uses different CRS standards (WGS84, UTM zones, national projections), requiring transformation before layers can be correctly overlaid. A common source of errors in legacy GIS workflows.
RasterFlow
Wherobots' serverless workflow engine purpose-built for Earth observation datasets. Takes raw, unprepared satellite imagery, builds inference-ready mosaics, runs PyTorch model inference (classification, segmentation, change detection), and returns results as geometries in Iceberg tables.
MCP Server
Model Context Protocol - a rising standard for enabling AI language models to call external tools. Wherobots' MCP server allows LLM-based agents to execute spatial queries and data engineering workflows through natural language, opening a path to AI-orchestrated geospatial analysis. Enables Claude Code and other AI coding assistants to work with spatial data deterministically.
FeltAI
Felt's AI-assisted interface for map development. Enables users and agents to customize and build map visualizations programmatically or through natural language, including layer definition, styling, sharing, and interactive dashboard configuration. Complements Wherobots' MCP integration for end-to-end agent orchestration.