LLM-Inference-on-Android
This project demonstrates how to run Large Language Model (LLM) inference locally on Android devices using MediaPipe. It provides a foundation for building applications that leverage the power of LLMs without relying on cloud-based APIs, ensuring privacy and enabling offline functionality.
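
As a quick orientation, here is a minimal sketch of on-device inference with MediaPipe's LLM Inference API in Kotlin. It assumes the `com.google.mediapipe:tasks-genai` Gradle dependency has been added and that a compatible model file has already been pushed to the device; the model path shown is hypothetical, so check the MediaPipe documentation for the exact options supported by your version.

```kotlin
import android.content.Context
import com.google.mediapipe.tasks.genai.llminference.LlmInference

// Sketch: load a local model and run a single synchronous generation.
// The model path is an assumption — point it at wherever your app stores the model.
fun runLocalInference(context: Context, prompt: String): String {
    val options = LlmInference.LlmInferenceOptions.builder()
        .setModelPath("/data/local/tmp/llm/model.bin") // hypothetical on-device path
        .setMaxTokens(512)       // upper bound on prompt + response tokens
        .setTopK(40)             // sample from the 40 most likely tokens
        .setTemperature(0.8f)    // softens the output distribution
        .build()

    // Everything below runs fully on-device: no network calls are made.
    val llmInference = LlmInference.createFromOptions(context, options)
    return llmInference.generateResponse(prompt)
}
```

Because `generateResponse` blocks until the full response is ready, in a real app you would call it off the main thread (e.g. from a coroutine on `Dispatchers.Default`) or use the streaming variant if your MediaPipe version provides one.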