When AI startups, research labs and even Fortune 500 companies compete for the same limited supply of GPUs, it becomes clear that the next race in AI is not about algorithms but about compute access. The chips that power artificial intelligence have become the most sought-after resource in the digital economy. According to Reuters, orders for Nvidia’s newest Blackwell GPUs already exceed 3.6 million units, driven largely by major cloud service providers. What used to be a competition of ideas is now a competition for compute.
If AI is the new electricity, then compute is the grid that powers it. That grid is largely controlled by a handful of corporations that decide who gets access to GPUs, how much they cost and which projects move forward. For smaller startups and public institutions, the price of entry is becoming almost impossible to afford.
The concentration of computing power has become one of the defining economic and political questions in technology. At the center of it is Nvidia, whose dominance over the global GPU market has reshaped how and where AI progress happens.
In response, a growing number of researchers and builders are rethinking how computing power itself should be owned and shared. Among them are David and Daniil Liberman, who are developing Gonka, a community-governed network that allows participants to contribute and exchange computing power. Their goal is to make AI access as open as the early internet, not a gated service run by a few dominant players.
Nvidia’s dominance has never been greater. The company controls about 94 percent of the GPU market and has become the invisible infrastructure behind nearly every modern AI system.
Nvidia also recently disclosed that just two unnamed direct customers accounted for 39 percent of its quarterly revenue. Most of its Blackwell GPUs go to major AI players such as OpenAI, xAI, Meta, Google, Microsoft and Amazon, David and Daniil Liberman said.
Such concentration is more than a business concern. It shapes who gets to innovate, how fast costs fall and which countries hold leverage in the coming AI economy. As Reuters recently reported, the global rush for AI chips has tightened supplies and raised costs across the semiconductor market, making it harder for smaller firms to compete.
The result is a growing dependence on a few cloud and hardware giants that can afford to secure massive GPU inventories. In practice, the global compute grid that powers AI is now largely in private hands.
The idea of compute as shared infrastructure is gaining momentum.
“In an efficient market, every product tends toward commoditization, driving margins down and pushing prices toward the lowest sustainable level. For this to happen with AI, we can draw inspiration from Bitcoin. Not as a financial asset, but as a blueprint for building massive, decentralized infrastructure,” the Libermans said.

Their analogy is striking.
“Today, Bitcoin miners collectively operate 26 gigawatts of data centers, more than Microsoft, Google and Amazon have built over decades. Meanwhile, advances in Bitcoin mining hardware have reduced the cost of compute, both in dollars and energy, by hundreds of thousands of times. If we can achieve the same transformation for AI compute, we’ll make it truly affordable and accessible to every person on Earth,” they said.
A 2025 study by Galaxy Research found that decentralized networks can, in specific workloads, outperform centralized clouds — but warned that verification and reliability remain steep hurdles. Researchers at Epoch AI, a nonprofit research group, described this as “the paradox of distributed systems”: The more open they become, the more coordination they demand. Without rigorous verification and incentives aligned with performance, community-run networks risk slipping into inefficiency or manipulation.
The challenge, therefore, is whether such decentralized networks can scale without reproducing the same bottlenecks they hope to eliminate.
History shows that decentralized systems can slowly recentralize, as power tends to pool where capital and capacity accumulate. When I asked about the risk of these systems repeating the same concentration patterns they aim to fix, the Libermans acknowledged that even decentralized systems can unintentionally favor the biggest players.
“No one can unilaterally change the rules of Bitcoin or Ethereum; any changes require wide consensus,” they said. “You’re right that some design rules give advantages to pools, which lead to concentration of power — so when we built the Gonka protocol, we intentionally avoided features like delegation.”
Gonka was originally incubated by Product Science, the U.S. software company founded by the Liberman siblings, but control of the network has since shifted to its users. The structure reflects their broader goal of ensuring that no single entity — founder, investor, or company — can unilaterally decide how the network evolves.
Still, experts caution that technology alone won’t solve governance risks. As the Pew Research Center noted, “technology is neither inherently helpful nor harmful. It is simply a tool… The real effects of technology depend upon how it is wielded.” In other words, even the most elegantly decentralized systems rely on the social and institutional choices that surround them.
Compute access is no longer just a technical question; it has become a geopolitical one. Governments in Europe, Asia and Africa are now linking AI sovereignty with control over computing resources.
“We spoke with government officials from four countries. They increasingly see decentralization as the only viable way to safeguard their sovereignty in the face of growing dependence on global AI infrastructure,” David said. “Their concern isn’t about control itself; it’s about the dominance of the U.S. and China, which can effectively cut them off from the prosperity that AI will bring. Decentralization is the only way to ensure their citizens have equal access to the full benefits of AI.”
Around the world, policymakers are beginning to treat compute as critical infrastructure, not just an industrial asset. According to the World Economic Forum, several regional alliances are beginning to build shared digital and compute hubs to strengthen local capacity and reduce dependence on U.S. and Chinese providers. Meanwhile, Coinfomania found that nearly half of U.S. workers now use AI tools daily, highlighting how access to computing power now influences productivity and economic inclusion.
Asked where the balance might go next, the Libermans offered two possibilities.
“We see two possible futures,” they said.
In one scenario, a few large labs in the U.S. and China control most of the world’s AI capacity. In the other, open networks like Gonka spark a new wave of hardware innovation that makes computing thousands of times cheaper and more evenly available everywhere.
“Big cloud companies will still matter in that world, but they won’t be able to charge such high premiums for access,” they noted.
In the end, the race for compute is really a race to make intelligence affordable and accessible. The players who figure out how to do that — either by democratizing access or enabling people to help govern it — could define the next decade.