Mistral: Mixtral 8x7B Instruct
Mixtral 8x7B Instruct is a pretrained generative Sparse Mixture-of-Experts model by Mistral AI, fine-tuned for chat and instruction following. It incorporates 8 experts (feed-forward networks) for a total of 47 billion parameters. #moe
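In a sparse Mixture of Experts, a learned router activates only a subset of the expert feed-forward networks for each token (Mixtral routes each token to 2 of its 8 experts), so per-token compute is far below what the 47B total parameter count suggests. A minimal sketch of that routing pattern in PyTorch; the dimensions and expert internals here are illustrative assumptions, not Mixtral's actual configuration:

```python
# Sketch of a sparse MoE feed-forward layer: 8 expert FFNs, top-2 routing.
# Hidden sizes are placeholders, not Mixtral's real hyperparameters.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseMoELayer(nn.Module):
    def __init__(self, d_model=512, d_ff=2048, n_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        # One gating (router) projection scores every expert for each token.
        self.router = nn.Linear(d_model, n_experts, bias=False)
        # Each expert is an independent feed-forward network.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.SiLU(), nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        )

    def forward(self, x):  # x: (n_tokens, d_model)
        logits = self.router(x)                         # (n_tokens, n_experts)
        weights, idx = logits.topk(self.top_k, dim=-1)  # keep only the top-2 experts
        weights = F.softmax(weights, dim=-1)            # normalize over the chosen experts
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, k] == e                   # tokens whose k-th choice is expert e
                if mask.any():
                    out[mask] += weights[mask, k:k+1] * expert(x[mask])
        return out

tokens = torch.randn(4, 512)
print(SparseMoELayer()(tokens).shape)  # torch.Size([4, 512])
```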
Input Cost: $0.54 per 1M tokens
Output Cost: $0.54 per 1M tokens
Context Window: 32,768 tokens
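Since input and output are billed at the same flat rate here, estimating a request's cost is a single multiplication. A quick sketch, with the rates taken from the listing above and the token counts purely hypothetical:

```python
# Back-of-the-envelope cost estimate at the listed rates ($ per 1M tokens).
INPUT_PER_M, OUTPUT_PER_M = 0.54, 0.54

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    return input_tokens / 1e6 * INPUT_PER_M + output_tokens / 1e6 * OUTPUT_PER_M

# e.g. a 10k-token prompt with a 2k-token completion:
print(f"${estimate_cost(10_000, 2_000):.4f}")  # $0.0065
```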
Developer ID: mistralai/mixtral-8x7b-instruct
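The Developer ID is the string you pass as the model parameter when calling the model programmatically. A minimal sketch assuming an OpenAI-compatible chat-completions endpoint; the base URL and API-key environment variable below are assumptions, so substitute your provider's actual values:

```python
# Hypothetical request using the Developer ID above. The endpoint URL and
# the OPENROUTER_API_KEY variable are assumptions, not taken from this page.
import os
import requests

resp = requests.post(
    "https://openrouter.ai/api/v1/chat/completions",  # assumed endpoint
    headers={"Authorization": f"Bearer {os.environ['OPENROUTER_API_KEY']}"},
    json={
        "model": "mistralai/mixtral-8x7b-instruct",   # Developer ID from above
        "messages": [{"role": "user", "content": "Explain sparse MoE in one sentence."}],
    },
    timeout=60,
)
print(resp.json()["choices"][0]["message"]["content"])
```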
Related Models
Mistral: Pixtral Large 2411 (mistralai, $2.00/1M)
Pixtral Large is a 124B parameter, open-weight, multimodal model built on top of [Mistral ...

Mistral: Mistral Medium 3 (mistralai, $0.40/1M)
Mistral Medium 3 is a high-performance enterprise-grade language model designed to deliver...

Mistral: Saba (mistralai, $0.20/1M)
Mistral Saba is a 24B-parameter language model specifically designed for the Middle East a...