The war between AI developers has only just begun. With rumors of numerous models under development in China competing with one another, and apparently disregarding what US and European developers have achieved so far, the first signs are appearing that DeepSeek is not the country's most advanced AI company, but merely one of several. A case in point is the model Alibaba has just presented: a new version of its Qwen 2.5 model, which it claims is more powerful than one of DeepSeek's latest models, V3.
According to Reuters, Alibaba says this new version, Qwen 2.5-Max, outperforms GPT-4o, DeepSeek-V3, and Llama-3.1-405B. It is an MoE (Mixture of Experts) model, an architecture that scales more efficiently than traditional dense models by activating only a subset of its parameters during inference. This architecture makes Qwen 2.5-Max a very powerful model without consuming a great deal of resources.
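The sparse-activation idea behind MoE can be illustrated with a toy sketch. This is not Alibaba's implementation; it is a minimal NumPy example, with made-up sizes (8 experts, top-2 routing), showing how a router picks only a few expert networks per input so most parameters stay idle:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy Mixture-of-Experts layer: 8 "expert" weight matrices, but only
# the top-2 experts (by router score) actually run for a given token.
n_experts, d = 8, 16
experts = [rng.standard_normal((d, d)) for _ in range(n_experts)]
router = rng.standard_normal((d, n_experts))

def moe_forward(x, top_k=2):
    scores = x @ router                    # one routing score per expert
    top = np.argsort(scores)[-top_k:]      # indices of the k best experts
    weights = np.exp(scores[top])
    weights /= weights.sum()               # softmax over the chosen experts only
    # Only top_k of the n_experts matrices are multiplied: this sparse
    # activation is what makes MoE scaling cheaper than a dense layer.
    return sum(w * (x @ experts[i]) for i, w in zip(top, weights))

y = moe_forward(rng.standard_normal(d))
print(y.shape)
```

Here 6 of the 8 expert matrices are skipped on every forward pass, which is the source of the efficiency claim: total parameter count grows with the number of experts, but per-token compute does not.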
As Alibaba Cloud points out, the model was pretrained on about 20 trillion tokens, making it one of the most data-rich models available today. This enables high performance across a wide range of tasks, from natural language understanding to complex reasoning. Beyond raw performance, Qwen 2.5-Max also stands out for its scalability and efficiency.
It can therefore handle increasingly complex tasks, processing large amounts of information quickly and quite precisely. That makes it useful for work such as real-time analytics, task automation, or the development of certain types of bots.
The model supports a maximum input sequence length of up to 100,000 tokens, far greater than the limits of many other models. This lets it maintain both coherence and relevance in longer conversations and when working with more extensive documents.
But as mentioned, DeepSeek and Alibaba are just some of the Chinese companies preparing, or having already launched, their own AI models. Shortly after DeepSeek presented its R1 model, ByteDance also updated its AI model, likewise claiming that it surpasses OpenAI's o1 on the AIME benchmark, which measures how well models understand and respond to complex instructions.
Moonshot, another Chinese startup advancing in multimodal models
Almost at the same time as ByteDance, another Chinese startup, Moonshot, announced Kimi 1.5, a multimodal reasoning model trained with reinforcement learning, with which it has updated its AI assistant, available on its website.
This model competes with the likes of GPT-4o and Claude 3.5 Sonnet, especially on tasks that require complex reasoning, such as working with text, images, and code. Its context window is even larger than Qwen 2.5's, at up to 128,000 tokens, and it can handle tasks that require both textual and visual understanding, such as interpreting diagrams or tables.
Kimi 1.5 comes in two versions. The first, Kimi 1.5 Long-CoT, is for detailed reasoning; the second, Kimi 1.5 Short-CoT, produces short, concise answers. In both cases, the company claims they surpass the performance of OpenAI's o1 and DeepSeek-R1. It also supports real-time web searches across up to a hundred websites and can simultaneously analyze 50 files, including PDFs, Word documents, PowerPoint presentations, and images.
Both versions have Chain-of-Thought (CoT) reasoning capabilities, which improve their problem-solving ability, and they also benefit from better image understanding. Kimi 1.5 supports English as well, although support for the language is still incomplete and being improved; among other things, both the interface and the file compatibility are only in Chinese.
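The Chain-of-Thought idea that distinguishes the Long-CoT and Short-CoT variants can be shown at the prompt level. This is a generic, hypothetical illustration (the question and wording are invented, and no real Kimi API is called): the same question is posed directly for a concise answer versus with an explicit instruction to reason step by step, which is the style of output a long-CoT mode produces:

```python
# Hypothetical illustration of Chain-of-Thought (CoT) prompting:
# the same question asked for a direct answer vs. with an instruction
# to reason step by step before answering.
question = "A train travels 120 km in 1.5 hours. What is its average speed?"

# Short-CoT style: ask for the answer directly.
short_prompt = f"Q: {question}\nA (answer only):"

# Long-CoT style: elicit intermediate reasoning before the answer.
long_prompt = (
    f"Q: {question}\n"
    "A: Let's think step by step. First restate what is given, "
    "then apply speed = distance / time, then state the result."
)

print(short_prompt)
print(long_prompt)
```

The underlying arithmetic the long prompt walks through is simply 120 km / 1.5 h = 80 km/h; the point of CoT is that making such intermediate steps explicit tends to improve accuracy on harder multi-step problems.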
Over the coming weeks, new AI model launches from China are likely to follow one after another, so companies developing artificial intelligence systems in the United States and Europe will have to step up their game if they don't want to fall behind.