Mistral AI

Major Contributor
About

French AI company

Portfolio Stats
Total Models: 16
Multimodal: 7
Benchmarks Run: 108
Avg Performance: 62.2%
Latest Release
Devstral Small 1.1
Released: Jul 11, 2025
Release Timeline
Recent model releases by year
2025: 9 models
2024: 7 models
Performance Overview
Benchmark categories and model statistics

Benchmark Categories

reasoning: 5 benchmarks, 79.8% avg
roleplay: 6 benchmarks, 77.3% avg
vision: 10 benchmarks, 73.6% avg
code: 23 benchmarks, 69.5% avg
math: 11 benchmarks, 66.0% avg

Model Statistics

Multimodal Ratio: 44% (7 of 16 models)
Models with Providers: 11

All Models

Complete portfolio of 16 models. Each entry below lists the release date, license, and any reported benchmark scores.
Devstral Medium (Mistral AI)
Devstral Medium builds upon the strengths of Devstral Small and takes performance to the next level with a score of 61.6% on SWE-Bench Verified. Devstral Medium is available through the Mistral public API, and offers exceptional performance at a competitive price point, making it an ideal choice for businesses and developers looking for a high-quality, cost-effective model.
Released: Jul 10, 2025
License: Proprietary
Benchmarks: 61.6%
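
The entry above notes that Devstral Medium is served through the Mistral public API. As a rough illustration, the sketch below sends a single chat request to the Mistral chat-completions endpoint; the model identifier devstral-medium-2507 is an assumption and should be checked against the current model list for your account.

```python
# Minimal sketch of calling Devstral Medium through the Mistral chat-completions API.
# The model ID "devstral-medium-2507" is an assumption; verify it in the Mistral docs.
import os
import requests

API_URL = "https://api.mistral.ai/v1/chat/completions"
API_KEY = os.environ["MISTRAL_API_KEY"]

payload = {
    "model": "devstral-medium-2507",  # assumed model ID
    "messages": [
        {"role": "user", "content": "Write a Python function that reverses a linked list."}
    ],
    "temperature": 0.2,
}

resp = requests.post(
    API_URL,
    headers={"Authorization": f"Bearer {API_KEY}", "Content-Type": "application/json"},
    json=payload,
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```
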
Devstral Small 1.1 (Mistral AI)
Devstral Small 1.1 (also called devstral-small-2507) is based on the Mistral-Small-3.1 foundation model and contains approximately 24 billion parameters. It supports a 128k token context window, which allows it to handle multi-file code inputs and long prompts typical in software engineering workflows. The model is fine-tuned specifically for structured outputs, including XML and function-calling formats. This makes it compatible with agent frameworks such as OpenHands and suitable for tasks like program navigation, multi-step edits, and code search. It is licensed under Apache 2.0 and available for both research and commercial use.
Released: Jul 11, 2025
License: Apache 2.0
Benchmarks: 53.6%
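
Since the entry above highlights Devstral Small 1.1's function-calling and structured-output tuning, here is a hedged sketch of exercising that format against an OpenAI-compatible local server (for example, one started with something like `vllm serve mistralai/Devstral-Small-2507`). The Hugging Face repo ID and the search_codebase tool definition are assumptions made purely for illustration.

```python
# Sketch: exercising Devstral Small 1.1's function-calling format against a locally
# served OpenAI-compatible endpoint. The repo ID "mistralai/Devstral-Small-2507" and
# the tool schema below are illustrative assumptions, not confirmed names.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed-locally")

# Hypothetical tool definition used purely for illustration.
tools = [{
    "type": "function",
    "function": {
        "name": "search_codebase",
        "description": "Search the repository for a symbol or string.",
        "parameters": {
            "type": "object",
            "properties": {"query": {"type": "string"}},
            "required": ["query"],
        },
    },
}]

resp = client.chat.completions.create(
    model="mistralai/Devstral-Small-2507",  # assumed model/repo name
    messages=[{"role": "user", "content": "Find where the config loader is defined."}],
    tools=tools,
)

# If the model decides to call the tool, the structured call appears here.
print(resp.choices[0].message.tool_calls)
```
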
Mistral Small 3.2 24B Instruct (Mistral AI)
Mistral-Small-3.2-24B-Instruct-2506 is a minor update of Mistral-Small-3.1-24B-Instruct-2503.
Released: Jun 20, 2025
License: Apache 2.0
Benchmarks: none reported
Magistral Small 2506 (Mistral AI)
Built on Mistral Small 3.1 (2503) with added reasoning capabilities, trained with SFT on Magistral Medium traces followed by RL, it is a small, efficient reasoning model with 24B parameters. Magistral Small can be deployed locally, fitting within a single RTX 4090 or a 32 GB RAM MacBook once quantized.
Released: Jun 10, 2025
License: Apache 2.0
Benchmarks: 51.3%
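
The entry above says Magistral Small fits on a single RTX 4090 or a 32 GB MacBook once quantized. One way to approximate that locally is 4-bit quantization via Hugging Face Transformers and bitsandbytes, sketched below under the assumption that the weights live at mistralai/Magistral-Small-2506; a GGUF build with llama.cpp would be the more typical route on a MacBook.

```python
# Sketch: loading Magistral Small locally with 4-bit quantization so the ~24B
# parameters fit in a single consumer GPU. The repo ID "mistralai/Magistral-Small-2506"
# is an assumption; substitute the actual Hugging Face repository.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "mistralai/Magistral-Small-2506"  # assumed repo ID

quant_cfg = BitsAndBytesConfig(load_in_4bit=True, bnb_4bit_compute_dtype=torch.bfloat16)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=quant_cfg,
    device_map="auto",
)

messages = [{"role": "user", "content": "Prove that the sum of two even numbers is even."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

out = model.generate(inputs, max_new_tokens=512)
# Print only the newly generated tokens, not the echoed prompt.
print(tokenizer.decode(out[0][inputs.shape[-1]:], skip_special_tokens=True))
```
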
Magistral Medium (Mistral AI)
Trained solely with reinforcement learning on top of Mistral Medium 3, Magistral Medium is a reasoning model that achieves strong performance on complex math and code tasks without relying on distillation from existing reasoning models. The training uses an RLVR framework with modifications to GRPO, enabling improved reasoning ability and multilingual consistency.
Released: Jun 10, 2025
License: Proprietary
Benchmarks: 47.1%, 50.3%
Mistral Small 3.1 24B Base (Mistral AI)
Pretrained base model version of Mistral Small 3.1. Features improved text performance, multimodal understanding, multilingual capabilities, and an expanded 128k token context window compared to Mistral Small 3. Designed for fine-tuning.
Released: Mar 17, 2025
License: Apache 2.0
Benchmarks: none reported
Mistral Small 3.1 24B Instruct (Mistral AI)
Building upon Mistral Small 3 (2501), Mistral Small 3.1 (2503) adds state-of-the-art vision understanding and enhances long context capabilities up to 128k tokens without compromising text performance. With 24 billion parameters, this model achieves top-tier capabilities in both text and vision tasks.
Released: Mar 17, 2025
License: Apache 2.0
Benchmarks: 88.4%, 74.7%
Mistral Small 3 24B Base (Mistral AI)
Mistral Small 3 is competitive with larger models such as Llama 3.3 70B or Qwen 32B, and is an excellent open replacement for opaque proprietary models like GPT-4o mini. It is on par with Llama 3.3 70B Instruct while being more than 3x faster on the same hardware.
Released: Jan 30, 2025
License: Apache 2.0
Benchmarks: 69.6%
Mistral Small 3 24B Instruct (Mistral AI)
Mistral Small 3 is a 24B-parameter LLM licensed under Apache-2.0. It focuses on low-latency, high-efficiency instruction following, maintaining performance comparable to larger models. It provides quick, accurate responses for conversational agents, function calling, and domain-specific fine-tuning. Suitable for local inference when quantized, it rivals models 2–3× its size while using significantly fewer compute resources.
Released: Jan 30, 2025
License: Apache 2.0
Benchmarks: 84.8%
Pixtral Large (Mistral AI)
A 124B parameter multimodal model built on top of Mistral Large 2, featuring frontier-level image understanding capabilities. Excels at understanding documents, charts, and natural images while maintaining strong text-only performance. Features a 123B multimodal decoder and 1B parameter vision encoder with a 128K context window supporting up to 30 high-resolution images.
Released: Nov 18, 2024
License: Mistral Research License (MRL) for research; Mistral Commercial License for commercial use
Benchmarks: none reported
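
To make the image-understanding claim above concrete, the sketch below sends a text-plus-image message to the Mistral chat-completions endpoint. Both the model ID pixtral-large-latest and the exact image content-part schema are assumptions, so verify them against the current Mistral vision documentation; the image URL is a placeholder.

```python
# Sketch: asking Pixtral Large to describe a chart via the Mistral chat-completions API.
# The model ID "pixtral-large-latest" and the image content-part schema are assumptions.
import os
import requests

payload = {
    "model": "pixtral-large-latest",  # assumed model ID
    "messages": [{
        "role": "user",
        "content": [
            {"type": "text", "text": "Summarize the main trend shown in this chart."},
            {"type": "image_url", "image_url": "https://example.com/quarterly-revenue.png"},
        ],
    }],
}

resp = requests.post(
    "https://api.mistral.ai/v1/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['MISTRAL_API_KEY']}"},
    json=payload,
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```
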
Showing 10 of 16 models.