January 29, 2027 edition


Foundation models for 24/7 Earth observation

AxionOrbital Space Is Turning Radar Into Something Humans Can Actually Read

Deep Learning · Satellites · Earth Observation · Defense

The Macro: Most Satellite Imagery Is Useless Most of the Time

There is a dirty secret in the Earth observation industry. Optical satellites, the ones that take the pretty pictures you see on news broadcasts and mapping platforms, cannot see through clouds. They also cannot see at night. Which means roughly 70% of the time, for any given point on Earth, your fancy satellite constellation is effectively blind.

This is not a minor limitation. It is the core constraint that defines the entire industry. Defense and intelligence agencies need persistent surveillance. Agricultural monitoring needs to track crop health through rainy seasons. Disaster response teams need imagery during storms, which is exactly when clouds are thickest. Maritime monitoring needs to track vessels 24/7. And all of them run into the same wall: when conditions are bad, which is when you need imagery most, optical satellites deliver nothing.

Synthetic Aperture Radar (SAR) solves the physics problem. Radar sees through clouds. Radar works at night. SAR satellites like those from Capella Space, ICEYE, and Umbra can image any point on Earth regardless of weather or lighting conditions. The catch? SAR data looks nothing like optical imagery. It is a grayscale field of raw backscatter values riddled with speckle noise, and it requires specialized training to interpret. Standard computer vision models break on it. Most analysts cannot read it without extensive experience. The data exists, but the usability gap is enormous.
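To see why standard vision models choke on raw SAR, it helps to look at the noise model: radar amplitude carries multiplicative speckle, and classical despeckling filters such as the Lee filter smooth it away at the cost of resolution. Here is a minimal sketch on simulated data; all parameters are toy values for illustration, and nothing here is from AxionOrbital's actual pipeline:

```python
import numpy as np

def lee_filter(img, win=5, noise_var=0.25):
    """Classic Lee despeckling filter (sketch) for multiplicative,
    unit-mean speckle with variance `noise_var`."""
    pad = win // 2
    padded = np.pad(img, pad, mode="reflect")
    out = np.empty_like(img, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            w = padded[i:i + win, j:j + win]
            m, v = w.mean(), w.var()
            # Trust the raw pixel only where local variance exceeds
            # what speckle alone would produce around this mean.
            sig = max(v - noise_var * m * m, 0.0)
            k = sig / (sig + noise_var * m * m + 1e-12)
            out[i, j] = m + k * (img[i, j] - m)
    return out

rng = np.random.default_rng(0)
clean = np.full((64, 64), 1.0)                                # flat backscatter scene
speckle = rng.gamma(shape=4.0, scale=0.25, size=clean.shape)  # mean 1, variance 0.25
noisy = clean * speckle                                       # multiplicative speckle
filtered = lee_filter(noisy)

print(f"variance before: {noisy.var():.3f}, after: {filtered.var():.3f}")
```

On a flat scene the filter collapses toward the local mean, but real structure gets blurred along with the noise. That trade-off is exactly what learned despeckling and translation models try to beat.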

AxionOrbital Space, out of Y Combinator’s W25 batch, is attacking this gap directly. They build foundation models that translate raw radar backscatter into analysis-ready optical-style imagery. In other words, they take the data SAR satellites produce and make it look and behave like the optical imagery that everyone already knows how to use.
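At its core, the translation problem is image-to-image regression: given co-registered SAR and optical pairs, learn a mapping from backscatter to reflectance. Production systems use deep generative models; as a deliberately tiny stand-in, here is a least-squares fit from simulated dB backscatter to a pseudo-optical band. Every scene parameter below is invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy "scene": ground-truth optical brightness in [0, 1].
optical = rng.uniform(0.1, 0.9, size=(32, 32))

# Simulate co-registered SAR: backscatter correlates with brightness
# here purely by construction, plus multiplicative speckle.
sigma0 = 0.02 + 0.3 * optical                                     # linear-scale backscatter
speckle = rng.gamma(shape=32.0, scale=1 / 32, size=sigma0.shape)  # multi-looked speckle
sar_db = 10 * np.log10(sigma0 * speckle)                          # SAR is usually viewed in dB

# "Translation model": least-squares fit from dB backscatter to optical.
# Real systems use deep image-to-image networks; this is the simplest
# possible stand-in to show the shape of the problem.
A = np.stack([sar_db.ravel(), np.ones(sar_db.size)], axis=1)
coef, *_ = np.linalg.lstsq(A, optical.ravel(), rcond=None)
pseudo_optical = (A @ coef).reshape(optical.shape)

err = np.abs(pseudo_optical - optical).mean()
print(f"mean absolute error of toy translation: {err:.3f}")
```

Even this toy recovers most of the brightness structure on a scene built to cooperate. The hard part, and the foundation-model bet, is making the mapping hold across terrain types, incidence angles, and sensor configurations.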

The Micro: Foundation Models for Radar Translation

The company was founded by Dhenenjay Yadav (CEO) and Atharva Peshkar (CTO), both based in San Francisco. The technical challenge they are tackling is significant. SAR-to-optical translation is not a new research area, but building foundation models that do it reliably at scale, across different terrain types, weather conditions, and SAR sensor configurations, is genuinely hard.

The key insight is that if you can make radar data consumable by standard vision pipelines and human analysts, you effectively unlock 24/7 continuous Earth observation without needing to build new satellite constellations. The SAR satellites are already up there. Capella, ICEYE, Umbra, and several government constellations are generating radar data constantly. What is missing is the translation layer that makes that data useful to the vast majority of users who cannot interpret raw SAR.

This positions AxionOrbital not as a satellite company but as an AI infrastructure layer. They do not need to launch anything into space. They need to build models that are good enough that defense agencies, agricultural analytics companies, and disaster response organizations trust the translated imagery for operational decisions.

The competitive environment is interesting. Planet Labs has the largest commercial optical constellation and is adding SAR capabilities. Capella Space is building its own analytics on top of its SAR data. Orbital Insight and Descartes Labs offer geospatial analytics platforms. But none of them are primarily focused on the SAR-to-optical translation problem as their core product. AxionOrbital has the advantage of focus.

Their GitHub presence suggests active open-source work, including a project called “axion-planetary-mcp” that appears related to their foundation model infrastructure. Open-source activity at a space- and defense-focused company is unusual and suggests they are building community and credibility with the research audience.

The Verdict

The market for persistent Earth observation is large and growing. Governments spend billions on satellite imagery. The commercial geospatial analytics market is expected to hit $15 billion within a few years. And the fundamental limitation that AxionOrbital is addressing, the inability to use SAR data without specialized expertise, is a real blocker for the entire industry.

At 30 days: what does the output quality look like? I want to see side-by-side comparisons of translated SAR imagery versus actual optical captures of the same location. The proof is in the pixels.

At 60 days: who are the early customers? If it is defense and intelligence, the revenue per contract could be massive but the sales cycles will be long. If it is commercial agriculture or insurance, the volume could be higher but the price per unit lower.

At 90 days: how does the model handle edge cases? Urban vs rural terrain. Ocean vs land. Dense forest vs open desert. A foundation model that works great on flat farmland but struggles with complex urban environments is useful but limited.

I think this is a strong technical bet in a market where the demand clearly exists. The question is whether the foundation model approach can deliver the accuracy and reliability that operational users require. If it can, AxionOrbital could become essential infrastructure for an industry that has been waiting for this exact solution.