If you’ve ever glanced at your Android phone’s storage breakdown and done a double-take at how much space AICore is consuming, you’re not alone. It’s one of those things that’s easy to notice and hard to explain, and for a while, Google wasn’t offering much clarity on it. That’s changed now, and the explanation turns out to be more sensible than the mystery surrounding it suggested.
AICore is the on-device AI backbone that powers a growing list of features on Android 14 and above: smart replies in WhatsApp, scam detection in messages, real-time transcription, grammar correction, audio summarization, and more. It runs Gemini Nano locally on supported hardware, which means your data stays on your device, the features work without an internet connection, and there’s no latency from bouncing a request off a remote server. The trade-off, as anyone who’s installed a multi-gigabyte model knows, is storage.
The storage spike has a simple explanation
Google has now published a support article addressing the one thing that confused people most: why AICore’s storage footprint sometimes balloons unexpectedly. The answer is that when a new version of Gemini Nano becomes available, AICore keeps both the old and the new versions on disk simultaneously for up to three days before clearing the original.
It’s a precautionary measure. If the new model version runs into problems after installation, your phone can instantly revert to the previous version rather than re-download gigabytes of model data from scratch. It’s the kind of sensible engineering decision that’s obvious in hindsight, but Google probably should have communicated it sooner, given how much confusion it has caused.
On-device AI is worth the storage cost, but Google needs to be upfront
The broader case for on-device AI is genuinely compelling. Sensitive data never leaving your device is a major privacy win in an era when everything seems to get vacuumed into the cloud somewhere. Features that work in airplane mode are more useful than they sound when you’re somewhere with patchy connectivity. And local processing simply feels snappier than waiting on a server response.
But the goodwill only stretches so far when users are left staring at an unexplained storage spike with no context. Documenting it now is the right call; it just shouldn’t have taken this long to get there.

