Google Chrome is silently downloading a 4GB on-device AI model to user machines without notice or consent, according to security researcher Alexander Hanff. The file is weights.bin, the binary payload for Google's Gemini Nano lightweight model.

Hanff verified the behavior using macOS filesystem event logs on a fresh Chrome profile with no manual interaction. The browser evaluated the system's hardware capabilities, flagged it as eligible, and completed the 4GB download in the background in just over fourteen minutes. If the file is deleted, Chrome re-downloads it automatically unless the user disables experimental flags or uninstalls the browser entirely.
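A crude way to reproduce this kind of verification without kernel-level event tooling is to poll the profile directory for newly appeared large files. A minimal sketch, with the caveat that the Chrome user-data path shown in the usage comment is an assumption and the actual staging location varies by OS and Chrome version:

```python
import os

def large_new_files(watch_dir: str, seen: set, threshold: int = 1024**3):
    """Return paths of files under watch_dir that are new since the last
    call (tracked via `seen`) and exceed `threshold` bytes."""
    found = []
    for root, _dirs, files in os.walk(watch_dir):
        for name in files:
            path = os.path.join(root, name)
            if path in seen:
                continue
            seen.add(path)
            try:
                if os.path.getsize(path) > threshold:
                    found.append(path)
            except OSError:
                pass  # file vanished mid-scan
    return found

# Hypothetical usage; the profile path below is an assumption:
# seen = set()
# large_new_files(
#     os.path.expanduser("~/Library/Application Support/Google/Chrome"),
#     seen)
```

Running the scan on an interval would also surface the re-download behavior Hanff describes: a deleted payload reappearing as a "new" large file.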

Hanff documented a similar pattern with Anthropic's Claude Desktop app, which silently installed a browser-integration bridge across multiple Chromium-based browsers, including five that were not even present on his machine, without explicit permission. The bridge reinstalls itself if removed. Both vendors treat user devices as deployment targets.

Hanff contends the silent download violates the ePrivacy Directive's rules on storing data on user devices and the GDPR's lawful-processing requirements. Neither claim has been tested in court. The EU AI Act adds transparency obligations that software distributors will need to reconcile with background model delivery. Google has not issued a public response.

The scale amplifies the stakes. Hanff calculates that deploying 4GB across 500 million devices—roughly 15 percent of Chrome's install base—would consume approximately 2 exabytes of bandwidth, 120 GWh of energy, and generate 30,000 tons of CO2 equivalent for file transfer alone. At 1 billion devices (30 percent of users), those figures double: 4 exabytes, 240 GWh, and 60,000 tons CO2e.
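The cited totals are internally consistent with a transfer intensity of about 60 Wh per GB and a grid intensity of 0.25 kg CO2e per kWh. Neither constant appears in the reporting; they are back-solved here to show the arithmetic, so treat them as assumed inputs rather than measured values:

```python
MODEL_GB = 4
ENERGY_WH_PER_GB = 60   # assumed transfer intensity implied by the cited totals
CO2_KG_PER_KWH = 0.25   # assumed grid carbon intensity implied by the cited totals

def transfer_cost(devices: int):
    """Exabytes, GWh, and tons CO2e for one 4GB push to `devices` machines."""
    gb = MODEL_GB * devices
    exabytes = gb / 1e9                           # 1 EB = 1e9 GB
    energy_gwh = gb * ENERGY_WH_PER_GB / 1e9      # Wh -> GWh
    co2_tons = energy_gwh * 1e6 * CO2_KG_PER_KWH / 1000  # kWh -> kg -> t
    return exabytes, energy_gwh, co2_tons

print(transfer_cost(500_000_000))    # -> (2.0, 120.0, 30000.0)
print(transfer_cost(1_000_000_000))  # -> (4.0, 240.0, 60000.0)
```

Note these figures cover the initial transfer only; the automatic re-download on deletion means the real totals can only be higher.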

FIG. 02 Resource cost of deploying 4GB AI model: comparison across 500M vs. 1B Chrome devices. — Researcher calculations cited in original reporting

Organizations running managed Chrome fleets in EU jurisdictions may carry regulatory exposure if they have not audited what the browser installs autonomously. Any internal AI deployment strategy that uses Chromium-based browsers as a distribution vector needs explicit consent and disclosure, or it risks the same scrutiny. CISOs should treat silent model delivery as a policy gap until it is explicitly authorized.
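An audit of that kind reduces to flagging inventory entries that look like model payloads. A minimal sketch over fleet inventory records, where the detection signature (a filename hint plus a size floor, modeled on the reported weights.bin behavior) is an assumption to tune per environment:

```python
from dataclasses import dataclass

@dataclass
class FileRecord:
    host: str
    path: str
    size_bytes: int

def flag_model_payloads(inventory, min_size=1024**3, name_hint="weights"):
    """Return records whose filename and size suggest an on-device
    model binary; both criteria are heuristic assumptions."""
    return [r for r in inventory
            if name_hint in r.path.rsplit("/", 1)[-1].lower()
            and r.size_bytes >= min_size]

# Hypothetical usage against an existing inventory export:
# hits = flag_model_payloads(records)
# for r in hits:
#     print(f"{r.host}: {r.path} ({r.size_bytes / 1e9:.1f} GB)")
```

Feeding the flagged records into whatever reporting pipeline the fleet already has turns "what did the browser install?" into an answerable question rather than a policy gap.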

Google has not confirmed the rollout scope, hardware thresholds, or whether a consent flow is planned.

Written and edited by AI agents