DailyGlimpse

Decentralized AI: Training Frontier Models Without Data Centers

AI
April 30, 2026 · 3:21 PM

In a recent episode of Eye on AI, host Craig Smith interviewed Steffen Cruz, co-founder and CTO of Macrocosmos, about a groundbreaking approach to training artificial intelligence models without relying on massive data centers. Cruz, who transitioned from subatomic physics to decentralized AI, outlined how blockchain-based networks like Bittensor enable distributed computing for AI training.

Traditional AI training requires enormous data centers with thousands of specialized chips, consuming vast amounts of energy. But Cruz argues this model is hitting a wall due to high costs, energy constraints, and centralization risks. Instead, Macrocosmos is building a decentralized network where individuals can contribute computing power from home, turning even a Mac Mini into a source of passive income by helping train AI models.

The system uses a blockchain registry, a synchronization clock, and a reward mechanism to coordinate distributed training. Contributors earn cryptocurrency for their computational work, effectively assembling a global supercomputer from spare hardware. Cruz highlighted projects like IOTA that aim to realize this vision, enabling anyone to participate in AI development.
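To make the three coordination pieces concrete, here is a minimal sketch of how they could fit together: a registry of contributors (the role the blockchain plays), a synchronization clock that slices time into training rounds, and a reward mechanism that splits a token pool in proportion to work done. All names and structures here are illustrative assumptions, not Bittensor's or Macrocosmos's actual protocol.

```python
class Registry:
    """Stand-in for the blockchain registry: tracks participating nodes and their work."""
    def __init__(self):
        self.nodes = {}

    def register(self, node_id):
        # In a real network, registration would be an on-chain transaction.
        self.nodes[node_id] = {"work": 0}


class Clock:
    """Synchronization clock: maps wall-clock time onto numbered training rounds."""
    def __init__(self, round_seconds):
        self.round_seconds = round_seconds

    def current_round(self, now):
        # Integer division assigns every timestamp to exactly one round.
        return now // self.round_seconds


def distribute_rewards(registry, pool):
    """Reward mechanism: split a fixed token pool in proportion to contributed work."""
    total = sum(n["work"] for n in registry.nodes.values())
    if total == 0:
        return {}
    return {nid: pool * n["work"] / total for nid, n in registry.nodes.items()}


# Usage: two home machines contribute unequal amounts of compute in one round.
reg = Registry()
reg.register("mac-mini-1")
reg.register("gpu-rig-2")
reg.nodes["mac-mini-1"]["work"] = 10   # e.g. gradient batches computed
reg.nodes["gpu-rig-2"]["work"] = 30

clock = Clock(round_seconds=60)
rewards = distribute_rewards(reg, pool=100.0)
print(clock.current_round(now=125))  # round 2
print(rewards)                        # {'mac-mini-1': 25.0, 'gpu-rig-2': 75.0}
```

The key design point is that rewards are proportional rather than flat, so a Mac Mini and a GPU rig can both participate profitably at their own scale.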

This approach could democratize AI development, reducing reliance on tech giants and making it accessible to a broader community. However, challenges remain, including network latency, trust, and ensuring model quality across distributed nodes. As data centers become increasingly strained, decentralized AI could offer a sustainable alternative.
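One common way to tackle the trust problem mentioned above is redundant computation: assign the same task to several independent nodes and accept a result only when a majority agrees. A minimal sketch of that idea, assuming a simple majority vote over hashes of each node's result (an illustrative mechanism, not the validation scheme any specific network uses):

```python
from collections import Counter

def verify_by_redundancy(results):
    """Accept a result only if a strict majority of assigned nodes agree.

    `results` maps node_id -> hash of the model update that node computed.
    """
    counts = Counter(results.values())
    value, votes = counts.most_common(1)[0]
    if votes > len(results) / 2:
        return value   # majority agrees: accept this update
    return None        # no majority: reject the round

# One faulty (or dishonest) node disagrees; the majority result still wins.
print(verify_by_redundancy({"a": "h1", "b": "h1", "c": "h2"}))  # 'h1'
```

Redundancy trades extra compute for trust, which is exactly the kind of overhead decentralized networks must balance against the cost savings of avoiding data centers.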