Phi-3-mini-128k-instruct-onnx

Phi-3 Mini-128K-Instruct ONNX optimized model for inference acceleration

Premium, New Product, Programming, Natural Language Processing, Large Language Model
Phi-3 Mini is a lightweight, state-of-the-art open model built on the synthetic data and filtered web data used for Phi-2, with a focus on high-quality, reasoning-dense content. It belongs to the Phi-3 family, and the Mini version comes in two variants supporting 4K and 128K context lengths. The model has undergone rigorous post-training, including supervised fine-tuning and direct preference optimization, to ensure precise instruction following and robust safety. These ONNX-optimized Phi-3 Mini models run efficiently on CPUs, GPUs, and mobile devices, and Microsoft's ONNX Runtime Generate() API simplifies running them.
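As a sketch of what the Generate() API workflow looks like, the snippet below uses the `onnxruntime-genai` Python package to load an exported Phi-3 Mini ONNX folder and run greedy generation. The model directory path and the `generate` helper are illustrative assumptions (point the path at whatever folder you downloaded from Hugging Face), and exact method names can vary between package versions; only the Phi-3 chat template (`<|user|>` / `<|end|>` / `<|assistant|>`) is taken from the model's documented prompt format.

```python
def phi3_prompt(user_message: str) -> str:
    """Wrap a user message in the Phi-3 instruct chat template."""
    return f"<|user|>\n{user_message}<|end|>\n<|assistant|>\n"


def generate(model_dir: str, question: str, max_length: int = 256) -> str:
    """Run one generation pass with the ONNX Runtime Generate() API.

    model_dir is a hypothetical local path to the downloaded ONNX model
    folder, e.g. one of the cpu_and_mobile / cuda variants from the
    Hugging Face repo. Requires `pip install onnxruntime-genai`.
    """
    import onnxruntime_genai as og  # imported lazily; optional dependency

    model = og.Model(model_dir)
    tokenizer = og.Tokenizer(model)

    params = og.GeneratorParams(model)
    params.set_search_options(max_length=max_length)

    generator = og.Generator(model, params)
    generator.append_tokens(tokenizer.encode(phi3_prompt(question)))

    # Decode token-by-token until the model emits an end token
    # or max_length is reached.
    while not generator.is_done():
        generator.generate_next_token()

    return tokenizer.decode(generator.get_sequence(0))


# Example usage (requires the model weights downloaded locally):
#   text = generate("cpu_and_mobile/cpu-int4-rtn-block-32-acc-level-4",
#                   "What is ONNX Runtime?")
```

The chat-template helper matters in practice: sending a bare string without the `<|user|>`/`<|assistant|>` markers typically degrades instruction following, since the instruct model was fine-tuned on that format.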
Phi-3-mini-128k-instruct-onnx Visit Over Time

Monthly Visits: 17,788,201

Bounce Rate: 44.87%

Pages per Visit: 5.4

Visit Duration: 00:05:32

Phi-3-mini-128k-instruct-onnx Visit Trend

Phi-3-mini-128k-instruct-onnx Visit Geography

Phi-3-mini-128k-instruct-onnx Traffic Sources

Phi-3-mini-128k-instruct-onnx Alternatives