Octopus-V2
Octopus-V2-2B is a 2-billion-parameter LLM that runs on mobile devices, with function-calling performance exceeding that of GPT-4.
Tags: Common Product, Productivity, Large Language Model, Mobile Device Optimized
Developed by NexaAI in collaboration with Stanford University researchers, Octopus-V2-2B is an open-source large language model with 2 billion parameters, tailored specifically for Android API function calls. It uses a functional-token strategy for both training and inference, achieving function-calling performance comparable to GPT-4 while significantly reducing inference latency. Octopus-V2-2B is well suited to edge devices, allowing direct on-device execution and supporting a wide range of applications.
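A minimal sketch of invoking the model for a function call, assuming it is published on Hugging Face under the repo name "NexaAIDev/Octopus-v2" and follows the standard transformers causal-LM interface; the exact prompt template and functional-token output format shown here are illustrative assumptions, not confirmed by this page.

```python
# Sketch: query Octopus-V2-2B for an Android API function call on-device.
# Assumptions: Hugging Face repo "NexaAIDev/Octopus-v2" exists and the model
# follows a plain query -> function-call prompt format (illustrative only).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "NexaAIDev/Octopus-v2"  # assumed repo name
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

# The model maps a natural-language query to a function call, emitting
# compact "functional tokens" instead of spelling out the full API name.
query = "Take a selfie with the front camera"
prompt = (
    "Below is the query from the users, please call the correct function "
    f"and generate the parameters to call the function.\n\nQuery: {query}\n\nResponse:"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=False)
# Keep special tokens so the functional tokens in the output remain visible.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=False))
```

Because the function name is collapsed into a single learned token, the model decodes far fewer tokens per call than a model that must spell out the API signature, which is where the claimed latency advantage on edge hardware comes from.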
Octopus-V2 Traffic Over Time
Monthly Visits: 20,899,836
Bounce Rate: 46.04%
Pages per Visit: 5.2
Average Visit Duration: 00:04:57