Mistral AI has announced the launch of its largest language model so far: Mistral Large. The model is aimed at tasks that require complex reasoning across multiple languages, including text comprehension, writing transformation, and code generation.
Mistral Large brings several improvements and advanced features compared to the company's other models. It is fluent in multiple languages, among them English, French, Spanish, German, and Italian. This ability is not limited to understanding words and texts: the model has a deep grasp of grammar and cultural nuance in each language.
In this way, Mistral Large can offer more accurate, context-aware translations and interactions. Another notable improvement over the company's other models is its extended context window of 32,000 tokens. This is a significant increase in the amount of text it can process at once, allowing it to extract accurate information from fairly long documents.
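To give a sense of scale, a back-of-the-envelope estimate of what a 32,000-token window holds can be sketched as follows. The words-per-token ratio used here is a common rule of thumb for English text, not an exact property of Mistral's tokenizer, and the words-per-page figure is likewise an assumption.

```python
# Rough illustration of what a 32,000-token context window holds.
# The ~0.75 words-per-token ratio is a heuristic for English text,
# not an exact property of Mistral's tokenizer.
CONTEXT_TOKENS = 32_000
WORDS_PER_TOKEN = 0.75   # heuristic; varies by language and tokenizer
WORDS_PER_PAGE = 500     # assumed typical page length

approx_words = int(CONTEXT_TOKENS * WORDS_PER_TOKEN)
approx_pages = approx_words // WORDS_PER_PAGE

print(f"~{approx_words} words, roughly {approx_pages} pages")
# → ~24000 words, roughly 48 pages
```

Under these assumptions, the window fits on the order of a few dozen pages of prose in a single request, which is what enables extraction from long documents without chunking.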
The model also achieves notable results at following instructions, and includes built-in features that let developers define custom moderation policies, for example. This makes it suitable, among other things, for content moderation on platforms such as the company's own chat interface, Le Chat, which was also unveiled at the Mobile World Congress.
This conversational interface lets users interact with several Mistral AI models: Large, Small (optimized for latency and cost), and Next, a prototype designed to be brief and concise. There is also a version for companies, Le Chat Enterprise, with self-deployment capabilities and improved moderation mechanisms.
Mistral Large's capabilities are especially valuable in application development, as the model enables interactive, personalized software solutions with a certain level of sophistication, both for companies and for consumers. In addition, its JSON mode makes the model output valid JSON, allowing developers to receive results from the model in a structured format that can be easily integrated into the applications they are building.
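As a minimal sketch of how that structured output can be consumed: the snippet below builds a chat-completions request that asks for JSON output via a `response_format` flag, then parses a simulated response body. The endpoint URL, model name, and field names follow Mistral's OpenAI-style chat completions API as documented at the time of writing and should be checked against the current API reference; no network call is made here.

```python
import json

# Assumed endpoint for Mistral's hosted API (La Plateforme);
# verify against the current API reference before use.
API_URL = "https://api.mistral.ai/v1/chat/completions"

def build_json_request(prompt: str) -> dict:
    """Payload requesting valid-JSON output via the response_format flag."""
    return {
        "model": "mistral-large-latest",  # assumed model identifier
        "messages": [{"role": "user", "content": prompt}],
        # Asks the model to emit syntactically valid JSON only.
        "response_format": {"type": "json_object"},
    }

def extract_structured(response_body: str) -> dict:
    """Parse the JSON object the model returned inside the first choice."""
    body = json.loads(response_body)
    content = body["choices"][0]["message"]["content"]
    return json.loads(content)  # parseable by construction under JSON mode

payload = build_json_request(
    'List three EU languages as {"languages": [...]}'
)

# Simulated response body, shaped like a chat completion reply:
sample = json.dumps({
    "choices": [{"message": {
        "content": '{"languages": ["French", "German", "Italian"]}'
    }}]
})
print(extract_structured(sample)["languages"])
# → ['French', 'German', 'Italian']
```

Because the content field is guaranteed to be valid JSON in this mode, the application can feed it straight into `json.loads` without the defensive parsing that free-form model output usually requires.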
The Mistral Large model will be available through the company's own platform, La Plateforme, can be deployed locally in certain cases, and is also offered via Azure AI Studio and Azure Machine Learning. With Azure, Microsoft becomes the first distribution partner for Mistral Large, thanks to an agreement the two companies have just signed, which was likewise announced at the MWC.