Octopus-V2
Octopus-V2-2B is a 2-billion-parameter LLM that runs on mobile devices, with function-calling performance exceeding GPT-4.
Tags: Productivity, Large Language Model, Mobile Device Optimized
Developed by the Stanford-affiliated team NexaAI, Octopus-V2-2B is an open-source large language model with 2 billion parameters, tailored specifically for Android API function calls. It uses a distinctive functional-token strategy for both training and inference, achieving function-calling accuracy comparable to GPT-4 while significantly improving inference speed. Octopus-V2-2B is especially well suited to edge devices, where it can run entirely on-device and support a wide range of applications.
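The function-call flow can be sketched with standard Hugging Face `transformers` code. This is a minimal sketch, not part of the listing: the repository id `NexaAIDev/Octopus-v2`, the prompt template, and the example query are taken from the public model card and should be treated as assumptions here. In practice the model responds with a functional token plus arguments, which an on-device dispatcher maps back to the concrete Android API.

```python
# Minimal sketch: asking Octopus-V2-2B to pick an Android API function for a query.
# Assumptions: Hugging Face repo id "NexaAIDev/Octopus-v2" and the prompt template
# below follow the public model card and may change.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "NexaAIDev/Octopus-v2"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

# Natural-language query the model should map to a functional token + arguments.
query = "Take a selfie with the front camera"
prompt = (
    "Below is the query from the users, please call the correct function "
    "and generate the parameters to call the function.\n\n"
    f"Query: {query}\n\nResponse:"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=False)

# Print only the newly generated tokens (the function call the model proposes).
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```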
Octopus-V2 Visits Over Time

Monthly Visits: 19,075,321
Bounce Rate: 45.07%
Pages per Visit: 5.5
Visit Duration: 00:05:32