Somewhere inside a folder you have probably never opened, Google Chrome may have placed a 4GB file on your computer without asking. It is called weights.bin, and it is a local copy of Gemini Nano, Google's on-device language model. Users who found it typically did so by accident: a storage alert, a routine disk cleanup, a moment of curiosity about why Chrome was using so much space. Chrome did not send a notification. No installation dialog appeared.
Security researcher Alexander Hanff, who goes by "That Privacy Guy," published the most detailed analysis of what is happening. His complaint is not that Gemini Nano is dangerous. It is that software you downloaded to browse the web has started making autonomous decisions about what else to install on your machine, without telling you it is doing so.
The technical story is relatively straightforward. Gemini Nano is the engine behind several Chrome AI features: scam detection in Google Messages, AI-assisted summarization, and optimization tools. Google has chosen to run the model locally rather than sending data to the cloud, which means user data stays on the device. From a privacy standpoint, local processing is usually considered the better option. The consent problem is separate from the privacy architecture: it is about what you agreed to when you installed Chrome, versus what Chrome subsequently decided to do on your behalf.
The behavior is also persistent in ways that surprised users who tried to remove the file. Deleting weights.bin manually does not work permanently: Chrome treats it as a missing component and re-downloads it. The only reliable removal method requires navigating to Chrome's internal flags page (chrome://flags), disabling the "optimization guide on device" setting, and allowing the browser to clean up automatically. Hanff notes this is unusual behavior for software a user never requested: self-repair logic for something that was never, in any meaningful sense, installed by the user in the first place.
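For readers who want to know whether the file is already sitting on their disk, the short sketch below walks Chrome's default user-data directories looking for weights.bin and reports the size of anything it finds. The directory locations are assumptions based on Chrome's standard per-platform defaults, and the subfolder that holds the model varies by Chrome version, so treat this as a starting point rather than a definitive check.

```python
# Minimal sketch: look for Gemini Nano's weights.bin under Chrome's
# default user-data directories. The paths below are assumptions based
# on Chrome's standard per-platform defaults; the subfolder containing
# the model may differ between Chrome versions.
import os
import sys
from pathlib import Path

# Assumed default Chrome user-data directories (not exhaustive).
CANDIDATE_DIRS = [
    Path.home() / "Library/Application Support/Google/Chrome",             # macOS
    Path.home() / ".config/google-chrome",                                 # Linux
    Path(os.environ.get("LOCALAPPDATA", "")) / "Google/Chrome/User Data",  # Windows
]


def find_weights(roots):
    """Recursively search each existing candidate directory for weights.bin."""
    for root in roots:
        if not root.is_dir():
            continue
        yield from root.rglob("weights.bin")


if __name__ == "__main__":
    hits = list(find_weights(CANDIDATE_DIRS))
    if not hits:
        print("No weights.bin found in the default Chrome directories.")
        sys.exit(0)
    for path in hits:
        size_gb = path.stat().st_size / 1e9
        print(f"{path}  ({size_gb:.1f} GB)")
```

Deleting anything the script turns up will not stick, for the reason described above: Chrome treats the file as a missing component and fetches it again. Disabling the flag and letting the browser remove the files itself is the only route Hanff found reliable.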
Hanff draws a direct line between the Chrome situation and a separate case he investigated involving Anthropic's Claude Desktop application. The Claude Desktop app, he reported, quietly installed browser-integration components for multiple Chromium-based browsers on a test system, including five browsers that were not themselves installed on the machine. The mechanism was different; the pattern was the same: software that arrives with explicit consent and then expands its footprint without announcement.
The comparison lands on a genuine industry tendency. Microsoft has been deploying AI features to Windows through automatic updates for over a year, with Copilot integrations arriving without distinct installation moments. The implicit theory across these deployments seems to be that AI features are improvements, improvements are good, and users who have consented to software updates have therefore consented to AI features. The argument is technically defensible and intuitively wrong to most users who encounter it.
The contrast with Apple's approach is instructive. Apple announced this week that iOS 27 will give users an explicit choice of which third-party AI models power their devices, with models from Google and Anthropic among the options. The "Extensions" system frames AI as something users select, not something that arrives. If Apple's consent architecture becomes the industry baseline, Google's defaults will look increasingly out of step.
What the Chrome story ultimately reveals is a consent gap that the software industry has not closed. Web browsers, operating systems, and applications are all accumulating AI capabilities through the path of least resistance: update mechanisms that carry new features alongside security patches, with no separate disclosure for components that represent qualitatively different additions to the software. A web browser acquiring a local language model is not the same kind of update as a security patch or a UI change, but it arrives through the same channel and with the same implicit consent.
Users who care can disable the feature. That requires knowing the feature exists, knowing where to find the flag, and accepting that a capability the browser vendor considers beneficial will be gone. Most users will not take those steps. Whether that constitutes informed acceptance of weights.bin on their storage is the question the industry has not answered, and does not appear to be in a hurry to ask.