
LLM-Inference-on-Android

Public

This project demonstrates how to run Large Language Model (LLM) inference locally on Android devices using MediaPipe. It provides a foundation for building applications that leverage the power of LLMs without relying on cloud-based APIs, ensuring privacy and enabling offline functionality.
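As a rough illustration of the approach, the sketch below shows how on-device inference typically looks with MediaPipe's LLM Inference API (the `com.google.mediapipe:tasks-genai` artifact). This is a minimal sketch, not the project's actual code: the model path is hypothetical, and a compatible on-device model file (e.g. a converted Gemma model) must already be present on the device.

```kotlin
import android.content.Context
import com.google.mediapipe.tasks.genai.llminference.LlmInference

// Minimal sketch of local LLM inference via MediaPipe tasks-genai.
// Assumption: a compatible model file has been pushed to the device;
// the path below is hypothetical.
fun runLocalInference(context: Context, prompt: String): String {
    val options = LlmInference.LlmInferenceOptions.builder()
        .setModelPath("/data/local/tmp/llm/model.bin") // hypothetical path
        .setMaxTokens(512) // cap on combined input + output tokens
        .build()

    // The model loads and generates entirely on-device: no network
    // round trip, so prompts never leave the phone and the app keeps
    // working offline.
    val llm = LlmInference.createFromOptions(context, options)
    return llm.generateResponse(prompt)
}
```

Because generation runs synchronously here, a real app would typically call this off the main thread (e.g. from a coroutine) to keep the UI responsive.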

Created: 2025-03-31T19:52:57
Updated: 2025-03-31T21:05:58
Stars: 0