Baidu has officially open-sourced its ERNIE 4.5 series, releasing 10 models on Hugging Face, GitHub, and its own PaddlePaddle ecosystem. The lineup includes large-scale MoE (mixture-of-experts) models with activated parameter counts of 47 billion and 3 billion (total parameters up to 424 billion), plus a smaller dense model at 0.3 billion parameters. A key feature is a multimodal heterogeneous MoE design that shares parameters across modalities while reserving dedicated parameters for each modality, aimed at enhancing tasks like vision-language reasoning without sacrificing text performance. The models were trained and optimized with PaddlePaddle, and Baidu reports MFU (model FLOPs utilization) of up to 47%. The weights are released under the Apache 2.0 license, permitting both research and commercial use. Supporting tools such as ERNIEKit and FastDeploy simplify fine-tuning and deployment across hardware. [Baidu, in Chinese]
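For readers who want to experiment with the release, here is a minimal sketch of loading the smallest dense variant through the standard Hugging Face `transformers` API. The repo id and the `trust_remote_code` flag are assumptions based on typical release conventions; check Baidu's Hugging Face organization for the exact model names and loading instructions.

```python
# Minimal sketch: run the 0.3B dense ERNIE 4.5 model via Hugging Face transformers.
# The repo id below is illustrative, not confirmed from the announcement.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "baidu/ERNIE-4.5-0.3B-PT"  # assumed repo id for the 0.3B dense variant

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",      # load in the checkpoint's native precision
    device_map="auto",       # place weights on available GPU/CPU (needs accelerate)
    trust_remote_code=True,  # early releases may ship custom model code
)

prompt = "Explain mixture-of-experts models in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```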