
mixtral-8x7b-instruct

Source: perplexity, Context: 4096

Available from 3 providers

| Provider | Source | Input Price ($/1M) | Output Price ($/1M) | Description | Free |
|---|---|---|---|---|---|
| perplexity | litellm | $0.07 | $0.28 | Source: perplexity, Context: 4096 | |
| fireworksai | litellm | $0.50 | $0.50 | Source: fireworks_ai, Context: 32768 | |
| openrouter | openrouter | $0.54 | $0.54 | Mixtral 8x7B Instruct is a pretrained generative Sparse Mixture of Experts model by Mistral AI, for chat and instruction use. Incorporates 8 experts (feed-forward networks) for a total of 47 billion parameters. Instruct model fine-tuned by Mistral. #moe. Context: 32768 | |
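Since all prices above are quoted in dollars per 1M tokens, the cost of a single request is simply the weighted sum of input and output tokens. A minimal sketch, using the prices from the table; the `request_cost` helper and the token counts are hypothetical examples, not part of any provider's API:

```python
# Estimate request cost from the per-1M-token prices listed above.
# Prices ($ per 1M tokens) are taken from the provider table; the
# token counts used below are hypothetical.
PRICES = {
    "perplexity": {"input": 0.07, "output": 0.28},
    "fireworksai": {"input": 0.50, "output": 0.50},
    "openrouter": {"input": 0.54, "output": 0.54},
}

def request_cost(provider: str, input_tokens: int, output_tokens: int) -> float:
    """Dollar cost of one request at the listed per-1M-token rates."""
    p = PRICES[provider]
    return (input_tokens * p["input"] + output_tokens * p["output"]) / 1_000_000

# e.g. 3,000 prompt tokens + 500 completion tokens via perplexity:
print(f"${request_cost('perplexity', 3000, 500):.6f}")  # $0.000350
```

Note that the cheapest provider also has the smallest context window (4096 tokens), so for long prompts the 32768-token providers may be the only option despite the higher rate.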