AI Has Learned to Write. Space Aye Wants It to See

Inside Space Aye’s plan to build a living digital twin of Earth—and teach AI to perceive the planet in real time.
Image: DEPOSITPHOTOS

Wired editorial staff were not involved with the creation of this content.

If you listen to Chris Newlands talk about space, you’ll notice something interesting. He doesn’t sound like a typical entrepreneur chasing the next frontier. He sounds more like an engineer explaining how reality itself might need an update.

That mindset has put him in rare company. Often mentioned in conversations about the next generation of space entrepreneurs, Newlands stands out for his precision.

From an office in Scotland, the former Royal Navy accountant turned geospatial architect is building what he calls a “Large Terrestrial Model.” This planetary-scale system is not just another language model with a fancy name. It’s what could give AI a sense of sight. “Maps tell you where you are,” Newlands says. “We show you what’s happening around you.”

That difference between being on the map and being in the image is what Newlands believes will define the next decade of technology. His company, Space Aye, merges in-orbit satellite imagery with data from billions of ground-based devices. The result could be a real-time, living model of Earth, constantly updated by the same digital exhaust that drives our phones, cars, ships, and cities.


Chris Newlands, CEO of Space Aye. Photo: Shelby Freeman Photography

How a Navy Accountant Ended Up Reimagining How the Planet Is Mapped

Newlands didn’t plan a career in space. He started with numbers and systems. “I was an accountant in the Navy and then a financial adviser,” he recalls. “I always tried to change big organizations from within but didn’t have the power. Timing matters. Space just happened to be opening commercially, and I saw a way to build something new.”

The opening came unexpectedly. During a meeting with two space scientists, Newlands learned that the European Space Agency was offering equity-free grants for businesses that could use space-derived data in new ways. “They didn’t have any ideas,” he says. “So I came up with one.” That simple prompt became the start of his career in orbit.

That “something” began with selfies from orbit. Space Aye’s first project, Spelfie, let users at concerts and festivals tag themselves in satellite photos, creating proof they were part of an event visible from space. “We thought of it as experiential engagement,” Newlands says. “People are collecting moments.” Then the pandemic hit, shutting down events but leaving Newlands with a far more powerful idea.

In the silence that followed lockdowns, the broader potential of his patent began to click. He realized the underlying technology—merging user-generated data with real-time satellite imagery—could extend far beyond selfies. It could identify ships, pipelines, vehicles, even people and livestock in live images. That capability, he realized, was the missing connective tissue between Earth observation and AI.

Satellites captured the world. Space Aye wants to show it in motion.

Newlands explains it simply: “Maps put your dot on the image. We identify you in the image. That’s the leap.” The distinction is more than semantic. In his world, a wildfire map wouldn’t just show where the fire is but also which fire trucks are trapped, which streets are open, and which families still need to be evacuated. With radar and infrared imaging, satellites can see through smoke and cloud cover, tracking both the blaze and the responders in real time, a perspective that turns disaster data into life-saving information. “A map tells you nothing about what’s actually happening on the ground,” he says. “Context saves lives.”


Photo Illustration: Pleiades Neo © Airbus DS 2023

That same ability to translate images into action applies across nearly every sector that moves people or goods. “The potential value could reach into the trillions,” he says—a figure he notes rivals some of the world’s largest national economies. “It’s bigger than the economy of the United Kingdom.” He talks about tracking containers, identifying oil leaks, even catching pipeline theft in Nigeria, where he notes $10 billion was lost in half a year. The difference, again, comes down to identification. “AI can classify and count. We identify with certainty. That’s the game-changer.”

The Next Leap in AI Isn’t Thought, It’s Vision

Artificial intelligence has learned to predict words. Space Aye wants to teach it to perceive the world. The Large Terrestrial Model (LTM) is Newlands’ answer to the limits of large language models. “LLMs have shelf lives,” he says. “They hallucinate. They inherit bias from human text. The next models will learn from the world itself.”

He isn’t alone in that belief, but Space Aye’s LTM aims to be the data engine for such a model. It merges real-time satellite imagery across multiple spectra, including radar, optical, infrared, and hyperspectral, with identity-rich signals from the Internet of Things. The result is a constantly updating, three-dimensional record of Earth and everything moving on it.

Newlands calls it “ground truth for AI.” Every object, from a truck to a turbine, can emit data confirming what it is and what it’s doing. “Right now, AI looks at thousands of images to guess,” he says. “We can show it the real thing. That certainty can take reliability from five nines to seven nines, or one error in three million decisions. That certainty changes everything.”

The vision is sweeping but practical. A live global model could support autonomous vehicles that see five miles in every direction, drones that avoid collisions mid-flight, and cities that route traffic or power based on what’s physically happening, not what’s inferred. It could also train the next generation of AI to understand cause, effect, and motion, concepts no language model can grasp.

The God’s-Eye Problem

Newlands admits the implications can be uncomfortable. “The weight of the crown rests heavily,” he says. “Space data without checks and balances would terrify me.” He describes what he calls “perceived privacy,” a world where most people already share their location and biometrics through devices but still believe they are invisible. Space Aye, he insists, cannot operate without consent and compliance. The company’s data merges occur behind customer firewalls, governed by the General Data Protection Regulation (GDPR) and international law.

He knows critics will still worry. “Power can be abused,” he says. “But I think people will trade a degree of privacy for safety. If your family is in danger and a satellite image can find them, you will want it used.” He compares the shift to how society once viewed CCTV as invasive but now accepts it as protective.

The API for Earth

Space Aye’s business model echoes its vision. Its Large Terrestrial Exchange (LTx) platform acts as a kind of hypermarket for imagery. Corporate customers can request a capture, pay on demand, and merge the image with their clients’ IoT data. Instead of selling 10 images to one client, the company can sell one image to 10,000. The economics scale the same way cloud computing did two decades ago. “That’s how we democratize space,” Newlands says.

The technology’s impact is already being explored across shipping, energy, insurance, and humanitarian sectors. Governments can use it to verify climate pledges, monitor deforestation, and coordinate disaster relief.

He envisions a future where the data also feeds global accountability. “Corporations will have to prove their impact on nature and people, not just report it. You can’t fake what a satellite sees.”


Photo Illustration: Space Aye

When the World Starts Watching Back

The next decade, Newlands believes, will belong to systems that understand the planet in real time. He imagines a world where every movement leaves a visible trace, where AI doesn’t just process text but interprets physical change. He calls it the “majority report,” a quiet inversion of the sci-fi dystopia. Replace precogs with data, add quantum computing, and the system can predict floods, stop crimes, and save lives.

Whether that’s utopian or unnerving will depend on who controls the lens. A live digital twin of Earth could become the foundation for climate resilience or the architecture of surveillance. It could save billions or waste billions.

Newlands doesn’t claim to have all the answers. But he’s confident that the digital and physical worlds are already deeply intertwined. “We already live in the matrix,” he says. “The question is whether we can make it humane.”