Google quietly released an app that lets you download and run AI models locally

GitHub - google-ai-edge/gallery: A gallery that showcases on-device ML/GenAI use cases and allows people to try and use models locally.

Why would I use this over Ollama?
Ollama can't run on Android
That's fair, but I think I'd rather self-host an Ollama server and connect to it with an Android client in that case. Much better performance.
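The self-hosted setup above boils down to hitting Ollama's REST API (port 11434 by default) from anything on the phone that can speak HTTP. A minimal sketch in Python; the server address is a placeholder for wherever your Ollama instance actually lives:

```python
import json
import urllib.request

# Placeholder LAN address of the self-hosted Ollama server; adjust to yours.
OLLAMA_URL = "http://192.168.1.10:11434/api/generate"

def build_request(model: str, prompt: str) -> bytes:
    """Build the JSON body Ollama's /api/generate endpoint expects."""
    body = {"model": model, "prompt": prompt, "stream": False}
    return json.dumps(body).encode("utf-8")

def generate(model: str, prompt: str) -> str:
    """POST the prompt to the server and return the generated text."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_request(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # Non-streaming responses carry the full output in "response".
        return json.loads(resp.read())["response"]
```

Any Android client that lets you point at a custom Ollama endpoint is doing essentially this under the hood.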
You can use it in Termux
Is there any useful model you can run on a phone?
llama.cpp (which Ollama runs on) can. And many chat apps for phones can use it.
Try PocketPal instead