Mistral AI

mistral.ai
Platform Stats
Total Models: 11
Organizations: 1
Verified Benchmarks: 0
Multimodal Models: 3
Pricing Overview
Avg Input (per 1M tokens): $0.50
Avg Output (per 1M tokens): $1.50
Cheapest Model: $0.10/1M
Premium Model: $2.00/1M
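
To make the per-1M-token rates above concrete, here is a small worked example. It is only a sketch using this page's average rates; the token counts are made up for illustration.

# Estimate the cost of one request at the average platform rates shown above.
# The rates come from this overview; the token counts are hypothetical.
AVG_INPUT_PER_1M = 0.50    # USD per 1M input tokens
AVG_OUTPUT_PER_1M = 1.50   # USD per 1M output tokens

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Cost in USD for a single request billed per million tokens."""
    return (input_tokens / 1_000_000) * AVG_INPUT_PER_1M \
         + (output_tokens / 1_000_000) * AVG_OUTPUT_PER_1M

# Example: a 2,000-token prompt with a 500-token reply costs $0.001750.
print(f"${request_cost(2_000, 500):.6f}")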
Supported Features
Number of models supporting each feature
Web Search: 0
Function Calling: 11
Structured Output: 11
Code Execution: 0
Batch Inference: 11
Fine-tuning: 0
Input Modalities
Models supporting different input types
Text: 11 (100%)
Image: 3 (27%)
Audio: 0 (0%)
Video: 0 (0%)
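
Three of the listed models accept image input (the two Pixtral models and Mistral Small 3.1, going by the descriptions below). A rough sketch of a mixed text-and-image request follows; the model identifier and the content-part format are assumptions, so check the current multimodal docs before relying on them.

import os
import requests

# Hedged sketch of an image-input chat request to a Pixtral-style model.
# Model id and content-part shape are assumptions (OpenAI-style convention).
API_URL = "https://api.mistral.ai/v1/chat/completions"
headers = {"Authorization": f"Bearer {os.environ['MISTRAL_API_KEY']}"}

payload = {
    "model": "pixtral-12b",  # assumed model identifier
    "messages": [{
        "role": "user",
        "content": [
            {"type": "text", "text": "Describe the chart in this image."},
            {"type": "image_url", "image_url": "https://example.com/chart.png"},
        ],
    }],
}

resp = requests.post(API_URL, headers=headers, json=payload, timeout=60)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])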
Models Overview
Top performers and pricing distribution

Pricing Distribution

Input pricing per 1M tokens:
$0-1: 9 models
$1-5: 2 models

Top Performing Models

By benchmark average:
#1 Mistral Large 2: 87.6%
#2 Pixtral Large: 80.5%
#3 Mistral Small 3 24B Instruct: 71.7%
#4 Pixtral-12B: 66.8%
#5 Codestral-22B: 65.9%

Most Affordable Models

Devstral Small 1.1: $0.10/1M
Mistral Small 3.1 24B Base: $0.10/1M
Mistral Small 3 24B Instruct: $0.10/1M

Available Models

11 models available through Mistral AI

Devstral Medium (Mistral AI)
Devstral Medium builds on the strengths of Devstral Small and raises performance further, scoring 61.6% on SWE-Bench Verified. It is available through the Mistral public API and offers strong performance at a competitive price point, making it a good fit for businesses and developers looking for a high-quality, cost-effective model.
Released: Jul 10, 2025
License: Proprietary
Benchmark score: 61.6%
Devstral Small 1.1 (Mistral AI)
Devstral Small 1.1 (also called devstral-small-2507) is based on the Mistral-Small-3.1 foundation model and contains approximately 24 billion parameters. It supports a 128k-token context window, which lets it handle multi-file code inputs and the long prompts typical of software engineering workflows. The model is fine-tuned specifically for structured outputs, including XML and function-calling formats, making it compatible with agent frameworks such as OpenHands and suitable for tasks like program navigation, multi-step edits, and code search. It is licensed under Apache 2.0 and available for both research and commercial use.
Released: Jul 11, 2025
License: Apache 2.0
Benchmark score: 53.6%
Mistral Small 3.1 24B Base (Mistral AI)
Pretrained base model version of Mistral Small 3.1. Features improved text performance, multimodal understanding, multilingual capabilities, and an expanded 128k-token context window compared to Mistral Small 3. Designed for fine-tuning.
Released: Mar 17, 2025
License: Apache 2.0
Benchmark scores: none listed
Mistral Small 3 24B Instruct (Mistral AI)
Mistral Small 3 is a 24B-parameter LLM licensed under Apache 2.0. It focuses on low-latency, high-efficiency instruction following while maintaining performance comparable to larger models, and provides quick, accurate responses for conversational agents, function calling, and domain-specific fine-tuning. Suitable for local inference when quantized, it rivals models 2–3× its size while using significantly fewer compute resources.
Released: Jan 30, 2025
License: Apache 2.0
Benchmark score: 84.8%
Ministral 8B Instruct (Mistral AI)
Ministral-8B-Instruct-2410 is an instruction-tuned model for local intelligence, on-device computing, and edge use cases, significantly outperforming existing models of similar size.
Released: Oct 16, 2024
License: Mistral Research License
Benchmark score: 34.8%
Pixtral-12B (Mistral AI)
A 12B-parameter multimodal model with a 400M-parameter vision encoder, capable of understanding both natural images and documents. Excels at multimodal tasks while maintaining strong text-only performance. Supports variable image sizes and multiple images in context.
Released: Sep 17, 2024
License: Apache 2.0
Benchmark score: 72.0%
Mistral NeMo Instruct (Mistral AI)
A state-of-the-art 12B multilingual model with a 128k context window, designed for global applications and strong in multiple languages.
Released: Jul 18, 2024
License: Apache 2.0
Benchmark scores: none listed
Mistral Small (Mistral AI)
An enterprise-grade 22B-parameter model optimized for tasks like translation, summarization, and sentiment analysis. Offers significant improvements in human alignment, reasoning capabilities, and code generation compared to previous versions.
Released: Sep 17, 2024
License: Mistral Research License
Benchmark scores: none listed
Codestral-22B (Mistral AI)
A 22B-parameter code generation model trained on 80+ programming languages, including Python, Java, C, C++, JavaScript, and Bash. Supports both instruction following and fill-in-the-middle (FIM) completion for code completion and generation tasks; a FIM request sketch appears after this listing.
Released: May 29, 2024
License: MNPL-0.1 (Mistral AI Non-Production License)
Benchmark scores: 81.1%, 78.2%
Pixtral Large (Mistral AI)
A 124B-parameter multimodal model built on top of Mistral Large 2, featuring frontier-level image understanding. Excels at understanding documents, charts, and natural images while maintaining strong text-only performance. Pairs a 123B multimodal decoder with a 1B-parameter vision encoder and a 128k context window supporting up to 30 high-resolution images.
Released: Nov 18, 2024
License: Mistral Research License (MRL) for research; Mistral Commercial License for commercial use
Benchmark scores: none listed
Showing 1 to 10 of 11 models
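
As noted in its entry above, Codestral-22B supports fill-in-the-middle (FIM) completion in addition to instruction following. Below is a minimal FIM request sketch, assuming a dedicated fim/completions endpoint that takes prompt and suffix fields; the endpoint path, model alias, field names, and response shape are all assumptions to verify against the current Codestral documentation.

import os
import requests

# Hedged fill-in-the-middle (FIM) sketch for a Codestral-style model.
# Endpoint path, model alias, field names, and response shape are assumptions.
FIM_URL = "https://api.mistral.ai/v1/fim/completions"
headers = {"Authorization": f"Bearer {os.environ['MISTRAL_API_KEY']}"}

payload = {
    "model": "codestral-latest",                   # assumed alias
    "prompt": "def fibonacci(n: int) -> int:\n",   # code before the gap
    "suffix": "\nprint(fibonacci(10))",            # code after the gap
    "max_tokens": 128,
    "temperature": 0.0,
}

resp = requests.post(FIM_URL, headers=headers, json=payload, timeout=30)
resp.raise_for_status()
# The completion should contain only the middle span that fits between
# the prompt and the suffix; response shape assumed to mirror chat completions.
print(resp.json())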