Mixtral MoE 8x22B Instruct

mixtral-8x22b-instruct

Mixtral 8x22B Instruct is an instruct-tuned, open-source mixture-of-experts model by Mistral, served by Fireworks.

Available at 3 Providers

| Provider | Source | Input Price ($/1M) | Output Price ($/1M) | Description | Free |
|---|---|---|---|---|---|
| vercel | vercel | $1.20 | $1.20 | 8x22B Instruct model. Mixtral 8x22B is an open-source mixture-of-experts model by Mistral, served by Fireworks. | — |
| fireworksai | litellm | $1.20 | $1.20 | Source: fireworks_ai. Context: 65536. | — |
| openrouter | openrouter | $2.00 | $6.00 | Mistral's official instruct fine-tuned version of [Mixtral 8x22B](/models/mistralai/mixtral-8x22b). It uses 39B active parameters out of 141B, offering strong cost efficiency for its size. Strengths: strong math, coding, and reasoning; large context length (64k); fluency in English, French, Italian, German, and Spanish. See benchmarks on the launch announcement [here](https://mistral.ai/news/mixtral-8x22b/). #moe. Context: 65536. | — |
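Since the providers above quote prices per million tokens, per-request cost is simple arithmetic. Below is a minimal sketch that computes it from the table's prices; the provider keys, function name, and example token counts are illustrative assumptions, not part of any provider's API.

```python
# Per-1M-token prices ($) taken from the table above.
# The dict keys and example token counts are hypothetical.
PRICES = {  # provider -> (input $/1M tokens, output $/1M tokens)
    "vercel": (1.20, 1.20),
    "fireworksai": (1.20, 1.20),
    "openrouter": (2.00, 6.00),
}

def request_cost(provider: str, input_tokens: int, output_tokens: int) -> float:
    """Return the USD cost of one request at the given provider."""
    in_price, out_price = PRICES[provider]
    return (input_tokens * in_price + output_tokens * out_price) / 1_000_000

# Example: a 10,000-token prompt producing a 2,000-token completion.
print(round(request_cost("openrouter", 10_000, 2_000), 4))  # → 0.032
print(round(request_cost("vercel", 10_000, 2_000), 4))      # → 0.0144
```

Note how the asymmetric output pricing at openrouter ($6.00 vs. $1.20 elsewhere) dominates the cost of generation-heavy workloads.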