Google released Gemma 4 this week: frontier-quality, free to download, runs on a phone, Apache 2.0 licence. The open-source community greeted it warmly, as it generally does. And it is genuinely good news for individuals and small operators who can't afford premium API access. But watching Google and Meta release increasingly capable open weights, I keep thinking about a different story: a tragedy of the commons, playing out in slow motion, built on the ruins of an idealism that was real.
OpenAI was named for a reason. The early AI research culture ran on shared papers, shared weights, open benchmarks, and a genuine belief that broadly distributed capability was safer and more beneficial than concentrated capability. That ethos had deep roots in the free software movement. It shaped a generation of researchers and produced the intellectual foundations of the current boom. Then came the capital requirements, the competitive pressures, and the dawning realisation that the thing being built was genuinely valuable. OpenAI went closed. The commons began to enclose.
What replaced it is something that wears the same clothes but has a different motivation entirely. When Meta releases Llama or Google releases Gemma, they are not acting out of open-source idealism. They are executing a corporate strategy: build an ecosystem around your tools, commoditise your competitors' advantages, shape what the developer community optimises for. The Apache 2.0 licence is real. The ethos is not. The idealists built the road, and someone else is driving the truck.
The truck, in this case, is displacement. A company that wants to replace a human worker with AI has always faced a cost calculation. Model API costs, infrastructure, integration, maintenance: it adds up, and for many roles the numbers have not yet justified the switch. That calculation changes when capable AI becomes free and runs locally. Not cheaper. Free. The per-query cost drops to electricity. At that point, the economic case for keeping a human in a role that AI can adequately perform becomes very hard to sustain. Gemma 4 ranks third among all open models in the world. It handles text at a level that was cutting-edge eighteen months ago. The range of roles this touches is not narrow.
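The "drops to electricity" claim can be made concrete with back-of-envelope arithmetic. Every figure below (GPU power draw, electricity price, throughput, query length, API rate) is an illustrative assumption for the sake of the comparison, not a measured value:

```python
# Back-of-envelope cost per query: locally run open-weight model vs. metered API.
# All numbers are illustrative assumptions, not measurements.

WATTS = 300            # assumed GPU power draw while generating
ELEC_PER_KWH = 0.25    # assumed electricity price, USD per kWh
TOKENS_PER_SEC = 40    # assumed local generation throughput
QUERY_TOKENS = 500     # assumed output tokens per query
API_PER_MTOK = 10.0    # assumed API price, USD per million output tokens

seconds = QUERY_TOKENS / TOKENS_PER_SEC
local_cost = (WATTS / 1000) * (seconds / 3600) * ELEC_PER_KWH  # kWh consumed * price
api_cost = QUERY_TOKENS / 1_000_000 * API_PER_MTOK

print(f"local: ${local_cost:.6f} per query")
print(f"api:   ${api_cost:.6f} per query")
```

Under these assumptions the local query costs a fraction of a cent, an order of magnitude below the metered API, and the gap only widens as hardware utilisation improves. The point is not the specific numbers but the shape of the curve: once the marginal cost is electricity, the pricing friction that proprietary APIs impose on mass deployment disappears.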
The original open-source AI researchers did not intend this. Many of them thought broad access would distribute power rather than concentrate it, that democratisation of capability would translate into democratisation of economic outcomes. That was a reasonable hope in a world where capability was the scarce resource. It turns out capability was never the real constraint. The constraint was always the economic structure that turns every productivity gain into a headcount reduction rather than shorter hours or a broader sharing of the surplus. Open weights do not change that structure. They accelerate through it, with none of the friction that proprietary pricing provided.
Democratisation of capability is real. A small business in Christchurch or Colombo can now run a model that a well-funded tech company would have paid heavily for two years ago. That matters. But it is not the same as democratisation of economic power, and conflating the two has been one of the more expensive category errors of the past decade. The people most exposed to displacement do not primarily suffer from lack of access to AI tools. They suffer from having their labour made redundant by those tools. Giving them access to the thing replacing them is a description of the problem, not a solution to it.
Gemma 4 is impressive. I will use it. But I think we owe it to the researchers who genuinely believed they were building a commons to be honest about what their work was ultimately harvested for.