Phi-3-mini-4k-instruct-onnx

A quantized ONNX model of Phi-3 Mini that accelerates inference across multiple hardware platforms

Premium · New Product · Programming · Natural Language Processing · Generative AI Model
Phi-3 Mini is a lightweight, state-of-the-art open model built on the synthetic data and filtered website data used for Phi-2, with a focus on high-quality, reasoning-dense data. The model has undergone a rigorous post-training process, combining supervised fine-tuning and direct preference optimization to ensure precise instruction following and robust safety measures. This repository provides optimized ONNX versions of Phi-3 Mini that can be accelerated on CPUs and GPUs with ONNX Runtime across servers, Windows, Linux, and Mac, with the best precision configuration for each platform. ONNX Runtime's DirectML support additionally lets developers tap into hardware acceleration at scale on Windows devices with AMD, Intel, and NVIDIA GPUs.
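As a rough sketch of what running one of these quantized models with the ONNX Runtime generate() API (the `onnxruntime-genai` package) can look like: the model directory name below is only an example of the repository's layout, and the exact API surface varies between package versions, so treat this as an illustration rather than a definitive recipe.

```python
# Minimal sketch: chat-style generation with the ONNX Runtime generate() API.
# Assumes `pip install onnxruntime-genai` and a local copy of one of the
# quantized model folders from the Phi-3-mini-4k-instruct-onnx repository
# (the directory name below is an assumption, not guaranteed).
import onnxruntime_genai as og

model = og.Model("cpu_and_mobile/cpu-int4-rtn-block-32")
tokenizer = og.Tokenizer(model)

# Phi-3 instruct models expect this chat template.
prompt = "<|user|>\nWhat is ONNX Runtime?<|end|>\n<|assistant|>\n"

params = og.GeneratorParams(model)
params.set_search_options(max_length=256)

generator = og.Generator(model, params)
# Newer onnxruntime-genai versions feed the prompt via append_tokens();
# older releases set params.input_ids instead.
generator.append_tokens(tokenizer.encode(prompt))

# Generate one token at a time until the model emits an end token
# or hits max_length.
while not generator.is_done():
    generator.generate_next_token()

print(tokenizer.decode(generator.get_sequence(0)))
```

The same script works unchanged on GPU builds of the package; ONNX Runtime picks the execution provider (CUDA, DirectML, or CPU) that matches the installed package and the model variant.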

Phi-3-mini-4k-instruct-onnx Visit Over Time

- Monthly Visits: 17,788,201
- Bounce Rate: 44.87%
- Pages per Visit: 5.4
- Visit Duration: 00:05:32

(Charts for visit trend, visit geography, traffic sources, and the alternatives list are omitted here.)