Ant Financial's Bailing (Ling) team announced today that its trillion-parameter flagship model, Ling-2.6-1T, has been open-sourced. The model carries forward the technical approach introduced last week: rather than pursuing ever-longer chains of thought or redundant parameter scale, it implements a "fast-thinking" mechanism through a hybrid architecture combining MLA (Multi-head Latent Attention) and linear attention, aiming to address the intelligence-versus-efficiency trade-off that trillion-parameter models face in real production workflows.
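The announcement does not publish Ling-2.6-1T's layer configuration, but the general idea of an MLA/linear-attention hybrid can be sketched as interleaving cheap O(n) linear-attention layers with occasional full softmax-attention layers. The shapes, interleaving ratio, and feature map below are illustrative assumptions, not the model's actual design:

```python
import numpy as np

def softmax_attention(q, k, v):
    """Standard softmax attention: O(n^2) in sequence length."""
    scores = q @ k.T / np.sqrt(q.shape[-1])
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    w = np.exp(scores)
    w /= w.sum(axis=-1, keepdims=True)
    return w @ v

def linear_attention(q, k, v):
    """Kernelized linear attention: O(n) via phi(q) @ (phi(k)^T v)."""
    phi = lambda x: np.maximum(x, 0.0) + 1e-6  # positive feature map (assumed)
    q, k = phi(q), phi(k)
    kv = k.T @ v              # (d, d) summary, independent of sequence length
    z = q @ k.sum(axis=0)     # per-query normalizer
    return (q @ kv) / z[:, None]

def hybrid_stack(x, n_layers=8, full_attn_every=4):
    """Interleave linear-attention layers with occasional full attention."""
    for i in range(n_layers):
        attn = softmax_attention if (i + 1) % full_attn_every == 0 else linear_attention
        x = x + attn(x, x, x)  # residual connection
    return x

rng = np.random.default_rng(0)
x = rng.normal(size=(16, 32))  # (sequence length, model dim)
y = hybrid_stack(x)
print(y.shape)                 # (16, 32)
```

The linear-attention layers keep per-token cost constant in sequence length, which is the usual rationale for such hybrids when targeting high token throughput in production serving.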


Benchmark results show that Ling-2.6-1T delivers notable token efficiency on the full Artificial Analysis evaluation suite, completing the entire assessment with only 16M tokens, at an output cost roughly a quarter that of comparable models. In overall intelligence, the model reaches the level of GPT-5.4 (non-reasoning mode), while on practical metrics such as reasoning, code generation, tool calling, and multi-step task execution, Ling-2.6-1T achieves state-of-the-art (SOTA) results among open-source models.
