Google’s latest AI model Gemma 3 270M fits on smartphones and barely consumes battery.
While Big Tech has primarily focused on cloud-based LLMs in recent years, Google is now releasing a new compact variant: Gemma 3 270M. It runs locally, for example on your smartphone or even completely in your browser.
270 Million Parameters
Earlier this year, Google released the first open Gemma 3 models, with between 1 and 27 billion parameters. Generally, the more parameters a model has, the better it performs. With just 270 million parameters, Gemma 3 270M is the smallest model in the Gemma family, yet it holds its own against its far larger siblings.
In the IFEval benchmark, Gemma 3 270M scored better than other lightweight models with more parameters, and achieved well over half the performance of the heavier versions.

AI on your Own Device
Running locally means lower latency, no data traffic to the cloud, and more privacy. According to Google, Gemma 3 270M can conduct 25 conversations on a Pixel 9 Pro while consuming barely 0.75 percent of the battery. This makes the model suitable for applications such as text processing, data processing, or lightweight chatbots.
You can download the model for free via Hugging Face, and it is also available in Google's Vertex AI. Google calls the models 'open', meaning developers can modify and use them without a separate license, although terms of use do apply. To showcase the possibilities, Google developed a story generator that you can try yourself in your browser.
