
LLM-Inference-on-Android

Public

This project demonstrates how to run Large Language Model (LLM) inference locally on Android devices using MediaPipe. It provides a foundation for building applications that leverage the power of LLMs without relying on cloud-based APIs, ensuring privacy and enabling offline functionality.
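The description above maps onto MediaPipe's LLM Inference task for Android. A minimal sketch of on-device inference might look like the following; the model path and prompt are placeholders, and the exact options available vary by MediaPipe version, so treat this as an illustration rather than the project's actual code:

```kotlin
import android.content.Context
import com.google.mediapipe.tasks.genai.llminference.LlmInference

// Hypothetical helper: runs a single prompt through an on-device model.
// Assumes a .task model bundle has already been pushed to the device.
fun runLocalLlm(context: Context, prompt: String): String {
    val options = LlmInference.LlmInferenceOptions.builder()
        .setModelPath("/data/local/tmp/llm/model.task") // placeholder path
        .setMaxTokens(512)
        .build()

    // Inference runs entirely on-device; no network call is made.
    LlmInference.createFromOptions(context, options).use { llm ->
        return llm.generateResponse(prompt)
    }
}
```

Because the model file ships with (or is downloaded once by) the app, this approach keeps prompts and responses on the device, which is what enables the privacy and offline properties the description mentions.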

Created: 2025-03-31T19:52:57
Updated: 2025-03-31T21:05:58
Stars: 0