
"With this launch, the French AI startup is attempting to gain ground on larger competitors such as Anthropic, as well as on specialized coding LLMs. Its not-so-secret weapon is an emphasis on following business context to produce optimal AI-generated code. Alongside the new Devstral model comes the playfully named Mistral Vibe, a command-line interface for code automation. Mistral wants to catch up with the AI leaders, reports TechCrunch, which was the first to report on the new model."
"Mistral is also responding to the vibe coding trend with Mistral Vibe, a CLI tool for searching code in natural language, versioning, and executing commands. It integrates with Zed as an IDE extension. Context awareness is at the core of Mistral AI's approach, which is particularly relevant for business applications. Like the AI assistant Le Chat, which remembers previous conversations and draws on them for answers, the Vibe CLI has a persistent history. It also scans file structures and Git statuses to build context that informs its behavior."
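The context-gathering step described above (scanning the file tree and the Git status before answering) can be sketched in a few lines of Python. This is an illustrative approximation only, not Mistral's actual implementation; the function name `gather_repo_context` is hypothetical:

```python
import os
import subprocess

def gather_repo_context(root="."):
    """Illustrative sketch: collect file paths and Git status lines,
    roughly the kind of context a coding CLI might prepend to a prompt."""
    files = []
    for dirpath, dirnames, filenames in os.walk(root):
        # Skip VCS internals to keep the context small.
        dirnames[:] = [d for d in dirnames if d != ".git"]
        for name in filenames:
            files.append(os.path.relpath(os.path.join(dirpath, name), root))
    try:
        git_status = subprocess.run(
            ["git", "status", "--porcelain"],
            capture_output=True, text=True, cwd=root, check=False,
        ).stdout.splitlines()
    except FileNotFoundError:  # git not installed
        git_status = []
    return {"files": sorted(files), "git_status": git_status}
```

A real tool would additionally filter, rank, and truncate this material to fit a context window; the sketch only shows where the raw signals come from.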
"Devstral 2 does have substantial system requirements, however. With 123 billion parameters, it requires at least four (!) H100 GPUs or the equivalent. The more compact and less capable Devstral Small has 24 billion parameters, small enough to run on relatively conventional workstations without much trouble. The licenses differ per model: Devstral 2 is covered by a modified MIT license, while Devstral Small uses Apache 2.0."
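The four-GPU figure is easy to sanity-check with back-of-the-envelope arithmetic, assuming 16-bit weights and 80 GB H100 cards (actual requirements depend on quantization and KV-cache overhead, which this sketch ignores):

```python
import math

def min_gpus(params_billions, bytes_per_param=2, gpu_memory_gb=80):
    """Rough lower bound on GPUs needed just to hold the model weights.

    Assumes 16-bit (2-byte) weights and ignores activation and KV-cache
    memory, so real deployments need at least this many cards.
    """
    weight_gb = params_billions * bytes_per_param  # 1e9 params * bytes/param ~ GB
    return math.ceil(weight_gb / gpu_memory_gb)

# Devstral 2: 123B params -> ~246 GB of weights -> 4 x 80 GB H100s
# Devstral Small: 24B params -> ~48 GB -> fits on a single high-end card
```

By this estimate, 123 billion parameters occupy roughly 246 GB in 16-bit precision, which exceeds three 80 GB cards (240 GB) even before inference overhead, consistent with the four-GPU minimum.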
Mistral launched Devstral 2 alongside a CLI called Mistral Vibe to target programmers and compete with established coding models. Devstral 2 focuses on following business context to produce optimal AI-generated code and integrates with developer workflows via Vibe, which supports natural-language code search, versioning, command execution, persistent history, and file/Git scanning. Devstral 2 has 123 billion parameters and requires at least four H100 GPUs; Devstral Small has 24 billion parameters and can run on conventional workstations. Licenses differ by model (modified MIT for Devstral 2, Apache 2.0 for Small). API pricing varies by model after the free period.
Read at Techzine Global