Mistral Large: here is the new generative model and the Le Chat chatbot to try immediately

We were among the first to cover the work of the French startup Mistral AI, which is gradually aiming to establish itself as an alternative to the solutions from OpenAI, Anthropic (founded by former OpenAI employees), Meta and Google. Its generative models can be integrated into applications that need to take advantage of advanced artificial intelligence features.

When the Mistral 7B model launched, we dwelt on the fact that it is an open source tool born from close collaboration with the CINECA/EuroHPC consortium. The creators of Mistral AI had in fact publicly thanked the operators of Leonardo, the EuroHPC supercomputer hosted in Italy, for allowing them to use its resources to train the model.

In short, Mistral AI's work was born, in a sense, under Europe's wing. EuroHPC (European High Performance Computing) is a joint technological initiative in the field of high-performance computing and a cornerstone of the European Union's industrial strategy for supercomputing and data processing.

Mistral Large, the great challenge of an all-European generative model

The new LLM (Large Language Model) Mistral Large, just unveiled by the Parisian company, is designed to rival other top-tier models from the competition, such as GPT-4 and Claude 2.

Founded by former DeepMind and Meta researchers, Mistral AI initially presented itself as a company focused on artificial intelligence solutions with a strong emphasis on open source. In short, what OpenAI was supposed to be. It must be said, however, that although the first Mistral AI model is available under a permissive license and the startup shared access to the model weights, the same cannot be said for the larger models released since then.

As time passes, Mistral AI's business model increasingly resembles the one chosen by OpenAI. The new Mistral Large, for example, can be queried through a paid API with consumption-based pricing.

At the moment, developers who want their applications to communicate with Mistral Large pay 7.30 euros per million input tokens and 22 euros per million output tokens. Input and output token counts refer, respectively, to the volume of words or text units sent to Mistral Large and to those generated by the model as a result of its processing.
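To give an idea of how the pay-per-use API works in practice, here is a minimal Python sketch that sends a prompt to Mistral Large through the chat completions endpoint and estimates the cost of the request from the token counts returned by the service. The model name, endpoint and environment variable reflect Mistral's public REST API at the time of writing and are assumptions that may differ from your setup.

```python
import os
import requests

# Minimal sketch of a call to Mistral's chat completions endpoint.
# Assumes an API key is available in the MISTRAL_API_KEY environment variable;
# endpoint and field names follow Mistral's OpenAI-compatible REST API.
API_URL = "https://api.mistral.ai/v1/chat/completions"

# Euro prices per million tokens quoted in the article (used here only for the estimate).
PRICE_IN_EUR_PER_M = 7.30
PRICE_OUT_EUR_PER_M = 22.0

def ask_mistral_large(prompt: str) -> None:
    response = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {os.environ['MISTRAL_API_KEY']}"},
        json={
            "model": "mistral-large-latest",
            "messages": [{"role": "user", "content": prompt}],
        },
        timeout=60,
    )
    response.raise_for_status()
    data = response.json()

    # The usage block reports how many input (prompt) and output (completion)
    # tokens were billed for this request.
    usage = data["usage"]
    cost_eur = (
        usage["prompt_tokens"] / 1_000_000 * PRICE_IN_EUR_PER_M
        + usage["completion_tokens"] / 1_000_000 * PRICE_OUT_EUR_PER_M
    )

    print(data["choices"][0]["message"]["content"])
    print(f"Tokens: {usage['prompt_tokens']} in / {usage['completion_tokens']} out "
          f"= about {cost_eur:.6f} euros")

if __name__ == "__main__":
    ask_mistral_large("Summarize the EuroHPC initiative in two sentences.")
```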

By default, Mistral Large supports a context window of 32,000 tokens (generally more than 20,000 words) and handles languages such as Italian, English, French, Spanish and German.

For comparison, OpenAI's GPT-4 with a 32,000-token context window currently costs $60 per million input tokens and $120 per million output tokens. Mistral Large is therefore currently around 5 to 7.5 times cheaper than GPT-4-32k.
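The "5 to 7.5 times" figure follows from converting Mistral's euro prices to roughly $8 (input) and $24 (output) per million tokens, an assumed dollar equivalent based on exchange rates at the time, and dividing GPT-4-32k's list prices by them:

```python
# Rough sketch of the price comparison, assuming Mistral's euro prices are
# roughly equivalent to $8 (input) and $24 (output) per million tokens.
gpt4_32k = {"input": 60.0, "output": 120.0}      # USD per million tokens
mistral_large = {"input": 8.0, "output": 24.0}   # assumed USD equivalent

for direction in ("input", "output"):
    ratio = gpt4_32k[direction] / mistral_large[direction]
    print(f"{direction}: GPT-4-32k is about {ratio:.1f}x more expensive")

# Expected output: input ~7.5x, output ~5.0x, hence "5 to 7.5 times cheaper".
```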

Try the new chatbot based on Mistral Large: here is Le Chat

Also available in a free version, Le Chat presents itself as an alternative to ChatGPT and to several other rival chatbots that have carved out a significant market share in the meantime.

After logging in to Le Chat, the Web chatbot uses the new Mistral Large generative model by default, as confirmed by the drop-down menu at the top right. Alternatively, you can choose a lighter model that is more concise in its answers, or Next, which is something of Mistral AI's "experimental laboratory". The idea is to offer an LLM (Large Language Model) prototype that is ahead of its time and that, by sacrificing a little precision, allows for more creative answers.

Le Chat, the chatbot based on Mistral Large

Mistral Large itself is, however, still referred to as a beta version, so some anomalous behavior is to be expected. The company also plans to launch a paid version of Le Chat for business customers. Corporate customers will also be able to define customized moderation mechanisms.

From the first visit, Mistral AI informs users that the information sent to the chatbot can be reused to improve the behavior of the generative model.

Mistral AI also announces a collaboration with Microsoft

When it comes to artificial intelligence and generative models, Microsoft is always paying close attention. In addition to Mistral's own API platform, Microsoft will offer Mistral's models to its Azure customers.

At first glance, the collaboration with Microsoft seems to boil down to the addition of the new Mistral Large model. In reality, Mistral AI and Microsoft are in talks with the aim of developing new forms of collaboration and, potentially, other initiatives.

As we have noted previously, Microsoft is the main investor in OpenAI. But it has also taken other AI models under its wing, adding them to its cloud computing platform. In addition to OpenAI, and now Mistral, Microsoft has a cooperation agreement with Meta to offer the Llama models on Azure.

It goes without saying that the open approach to AI partnerships is a good way for Microsoft to keep users on the Azure platform and in its product ecosystem. It could also be useful in the event of antitrust checks by the competent authorities.
