Singapore’s national AI program has moved its Sea-Lion large language model off Meta’s model family and onto Alibaba Cloud’s Qwen architecture, according to AI Singapore (AISG), as cited by foreign media. The latest version, Qwen-Sea-Lion-v4, was trained with technical support from Alibaba Cloud and built on the Qwen3-32B foundation model, which covers 119 languages and dialects and was trained on 36 trillion tokens. Alibaba Cloud said the model received an additional 100 billion Southeast Asian language tokens for this collaboration, while AISG contributed regional datasets and handled evaluation.
Qwen-Sea-Lion-v4 currently ranks first among open-source models under 200 billion parameters on the Southeast Asian Holistic Evaluation of Language Models (SEA-HELM) benchmark. The model is available on the AI Singapore website and on Hugging Face. Singapore launched its national multimodal model program in December 2023 with SGD 70 million (USD 51 million) in funding. [TechNode reporting]