Tencent has officially open-sourced the compact AI translation model Hy-MT1.5-1.8B-1.25bit. The company claims that the model can run completely offline on smartphones while maintaining high performance. Currently, the model supports 33 languages and five dialects, including Chinese, English, German, French, Japanese, Tibetan, and Mongolian, covering 1056 translation directions, and has won a total of 30 championships in international machine translation competitions.
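The 1056-direction figure quoted above is consistent with translating between every ordered pair of the 33 supported languages; a quick sanity check (illustrative only):

```python
# 33 supported languages, one translation direction per ordered
# (source, target) pair with source != target:
languages = 33
directions = languages * (languages - 1)
print(directions)  # 1056
```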

Chart: FLORES-200 translation quality versus model size, showing the 1.25-bit (440 MB) and 2-bit (574 MB) Hy-MT variants scoring on par with much larger models.

The core technical breakthrough is an aggressive compression scheme: a quantization technique that stores each parameter in just 1.25 bits shrinks the model from 3.3 GB to 440 MB, about 25% smaller than the previous 1.67-bit scheme, while improving inference speed by roughly 10% and, according to Tencent, preserving translation quality. In standard benchmarks, the 440 MB Hy-MT delivers translation quality comparable to commercial translation services and large models such as Qwen3-32B, letting a model with a tiny footprint compete with hundred-gigabyte-scale models.
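A back-of-the-envelope check of these compression figures. The 1.8B parameter count, bit-widths, and package sizes come from the article; attributing the remaining megabytes to higher-precision components (embeddings, quantization scales) is an illustrative assumption, not a description of the actual format:

```python
# Rough size arithmetic for a 1.25-bit quantized 1.8B-parameter model.
def raw_weight_mb(params: float, bits_per_param: float) -> float:
    """Raw weight storage in MB: params * bits, divided by 8 bits per byte."""
    return params * bits_per_param / 8 / 1e6

PARAMS = 1.8e9  # 1.8B parameters, per the article

raw_125bit = raw_weight_mb(PARAMS, 1.25)  # ~281 MB of raw 1.25-bit weights
overhead = 440 - raw_125bit               # rest of the reported 440 MB package
compression = 3300 / 440                  # 3.3 GB original vs 440 MB package

print(f"raw weights: {raw_125bit:.0f} MB, compression: {compression:.1f}x")
```

The gap between the ~281 MB of raw 1.25-bit weights and the 440 MB package illustrates why quantized model files are larger than a pure bits-times-parameters estimate suggests.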

Tencent has also released an Android demo application (as an APK) that supports offline translation of on-screen text across any mobile app. Industry observers note that with Google's launch of the on-device model Gemma4, on-device AI has become the new front in the technology race. Through its quantization breakthroughs, Tencent's Hy-MT significantly lowers the compute threshold for high-quality AI translation, offering a highly competitive open-source foundation for offline, privacy-sensitive applications on smart devices.