Codestral
by Mistral
Codestral is Mistral's dedicated code model, trained on 80+ programming languages. Among openly available models it leads on HumanEval and other code generation benchmarks, and it is optimized both for code completion in IDEs and for full code generation via API.
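As a sketch of the IDE-completion use case, the snippet below builds a fill-in-the-middle (FIM) style request payload — the kind an editor plugin would send with the code before and after the cursor. The endpoint URL, model name, and field names here are assumptions for illustration, not confirmed API specifics.

```python
import json

# Assumed endpoint and field names for a Codestral-style FIM completion API;
# check Mistral's official API documentation for the real values.
CODESTRAL_FIM_URL = "https://codestral.mistral.ai/v1/fim/completions"  # assumption

def build_fim_request(prefix: str, suffix: str,
                      model: str = "codestral-latest",  # assumed model id
                      max_tokens: int = 64) -> dict:
    """Build a JSON payload asking the model to complete code between prefix and suffix."""
    return {
        "model": model,
        "prompt": prefix,    # code before the cursor
        "suffix": suffix,    # code after the cursor
        "max_tokens": max_tokens,
    }

payload = build_fim_request("def add(a, b):\n    return ", "\n\nprint(add(2, 3))")
print(json.dumps(payload, indent=2))
```

The model is expected to return only the span between `prompt` and `suffix` (here, something like `a + b`), which the IDE splices in at the cursor.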
More from Mistral
Mistral Large 2
Mistral Large 2 is Mistral AI's frontier model with strong reasoning, multilingual capabilities, and a 128K context window. Built in France, it's particularly strong on European languages and compliance-sensitive enterprise use cases.
$2.00 / M tokens
Mistral 7B Instruct
Mistral 7B Instruct is Mistral's compact instruction-following model, widely known for punching well above its weight class on coding and reasoning benchmarks. It's fully open-source under Apache 2.0 and efficient enough to run on a single consumer GPU.
$0.08 / M tokens
Mixtral 8×7B
Mixtral 8×7B is Mistral's sparse mixture-of-experts (MoE) model: each forward pass activates only about 13B of its roughly 47B total parameters, delivering GPT-3.5-class performance at a fraction of the compute of a comparable dense model. It's widely used in production RAG and function-calling pipelines.
$0.24 / M tokens
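The sparse-MoE idea behind Mixtral — only a few experts run per token — can be sketched with a toy top-2 router. The expert count matches Mixtral's 8-expert, top-2 design, but the scores and gating details here are illustrative, not Mixtral's actual implementation.

```python
import math

# Toy top-2 sparse MoE routing: of NUM_EXPERTS experts, only TOP_K are
# selected per token, so only those experts' parameters are used in the
# forward pass (hence ~13B active out of ~47B total in Mixtral's case).
NUM_EXPERTS = 8
TOP_K = 2

def top_k_gate(logits, k=TOP_K):
    """Pick the k highest-scoring experts and softmax-renormalize their weights."""
    top = sorted(range(len(logits)), key=lambda i: logits[i], reverse=True)[:k]
    exps = {i: math.exp(logits[i]) for i in top}
    z = sum(exps.values())
    return {i: w / z for i, w in exps.items()}  # expert index -> gate weight

# Router scores for one token over the 8 experts (made-up numbers):
gates = top_k_gate([0.1, 2.0, -1.0, 0.5, 1.5, 0.0, -0.5, 0.3])
print(sorted(gates))  # only these two experts run for this token
```

The token's output is then the gate-weighted sum of just the selected experts' outputs; the other six experts contribute no compute at all for this token.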