Using MediaPipe LLM Inference API in an Android App

What Is MediaPipe LLM Inference?

If you're building an Android app that needs to run large language models (LLMs) on-device, the MediaPipe LLM Inference API is one of the most accessible ways to get there. It handles model loading, quantisation, and hardware acceleration so you can focus on the experience: no cloud dependency, no server […]
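To make the idea concrete, here is a minimal sketch of what on-device inference can look like with the MediaPipe Tasks GenAI library. It assumes the `com.google.mediapipe:tasks-genai` dependency is added to the app and that a compatible model bundle has already been pushed to the device; the file path and token limit below are illustrative assumptions, not values from this article.

```kotlin
import android.content.Context
import com.google.mediapipe.tasks.genai.llminference.LlmInference

// Minimal sketch: load an on-device model and run one synchronous generation.
// Assumes a compatible model file (e.g. a Gemma bundle) was already pushed to
// the device at the path below -- adjust for your app.
fun runOnDeviceLlm(context: Context, prompt: String): String {
    val options = LlmInference.LlmInferenceOptions.builder()
        .setModelPath("/data/local/tmp/llm/model.task") // assumed path
        .setMaxTokens(512) // illustrative limit
        .build()

    // createFromOptions loads the model and selects a hardware backend.
    val llm = LlmInference.createFromOptions(context, options)
    val reply = llm.generateResponse(prompt)
    llm.close() // release model memory when done
    return reply
}
```

Because the model runs locally, the first call pays a one-time load cost; for a chat-style UI you would keep the `LlmInference` instance alive rather than recreating it per request.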