The US export controls on advanced semiconductors were supposed to slow China's AI development. By most measures, they haven't. New projections reported by CNBC this week show China currently has around 5 zettaFLOPS of AI compute and added more than 500 gigawatts of power capacity last year alone. If that trajectory continues, analysts project China could reach nearly 2,000 zettaFLOPS by 2035, more than three times current US capacity. The controls succeeded in preventing China from importing Nvidia's best chips; they did not prevent China from building its own chips, its own chip companies, and much of the infrastructure that makes compute actually work at scale.
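It is worth pausing on what that projection implies. Taking the article's two endpoints at face value, and assuming a ten-year horizon (2025 to 2035, an assumption not stated in the reporting), a back-of-envelope calculation gives the compound annual growth rate the forecast requires:

```python
# Back-of-envelope check of the projection's implied growth rate.
# Figures from the article: ~5 zettaFLOPS today, ~2,000 zettaFLOPS by 2035.
# The 10-year horizon is an assumption, not stated in the reporting.
current_zflops = 5.0        # estimated Chinese AI compute now
projected_zflops = 2000.0   # analyst projection for 2035
years = 10                  # assumed horizon

# Compound annual growth rate implied by those two endpoints
cagr = (projected_zflops / current_zflops) ** (1 / years) - 1
print(f"Implied annual compute growth: {cagr:.0%}")  # roughly 82% per year
```

A sustained ~82% annual growth rate is aggressive, but it is the kind of number that build-outs of power capacity on the scale described are meant to support.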
Bloomberg has been tracking the new generation of Chinese AI billionaires who have emerged from this process: entrepreneurs who either left Nvidia and AMD to start domestic chip companies, or who built AI model companies on top of the homegrown hardware. The story here is not just that China is catching up; it is that the export control pressure appears to have accelerated the diversification of China's AI supply chain rather than constraining it. A country that can produce its own frontier chips and its own frontier models is less exposed to US policy leverage than one that can't.
Google DeepMind CEO Demis Hassabis stated in January that China was "months" behind US AI model capabilities. That framing is important because "months" is a very different thing from "years" or "a generation" in a field moving this quickly. A six-month lead was always going to be hard to maintain if the gap was closing. The compute projections suggest it may be even harder than previously understood, because model capability is a function of both algorithmic progress and raw compute, and China appears to be aggressively addressing both.
What makes this geopolitically significant is the open-source dimension. Chinese AI companies have been releasing competitive open-source models, which means Chinese AI research is not just serving Chinese users but shaping the global AI ecosystem. When DeepSeek models began circulating in early 2025 and outperformed equivalently sized Western models on some benchmarks, it was a demonstration that the compute gap and the quality gap are not the same gap. Open-source distribution removes the export control problem entirely: you cannot embargo a model weight file once it is public.
The deeper question this raises is whether the US-China AI competition is actually a race in the way that framing implies, or whether it is more like parallel development of the same technology with different governance flavors on top. The compute numbers suggest the gap is closing on the infrastructure side. The model quality numbers suggest the gap may already be much smaller than US export control policy assumes. Neither of those is a comfortable conclusion if the goal was to maintain a durable strategic advantage through chip restrictions alone.