[GUEST ACCESS MODE: Data is scrambled or limited to provide examples. Make requests using your API key to unlock full data. Check https://lunarcrush.ai/auth for authentication information.]

![rohanpaul_ai Avatar](https://lunarcrush.com/gi/w:24/cr:twitter::2588345408.png) Rohan Paul [@rohanpaul_ai](/creator/twitter/rohanpaul_ai) on x 73.9K followers
Created: 2025-06-30 03:55:10 UTC

🚨 CHINA’S BIGGEST PUBLIC AI DROP SINCE DEEPSEEK

@Baidu_Inc open-sources Ernie: XX multimodal MoE variants

🔥 Surpasses DeepSeek-V3-671B-A37B-Base on XX out of XX benchmarks

🔓 All weights and code released under the commercially friendly Apache XXX license (available on @huggingface)

Thinking and non-thinking modes available

📊 The 21B-A3B model beats Qwen3-30B on math and reasoning despite using XX% fewer parameters

🧩 XX released variants range from 0.3B dense to 424B total parameters. Only 47B or 3B parameters stay active, thanks to mixture-of-experts routing
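
The headline of MoE efficiency is that per-token compute scales with the active parameter count, not the total. A quick back-of-envelope check using the figures quoted above (a hypothetical helper, not part of any release):

```python
# With mixture-of-experts routing, only the experts selected for a token
# contribute compute, so the "active" parameter count is a small
# fraction of the total.
def active_fraction(active_params_b: float, total_params_b: float) -> float:
    """Fraction of parameters exercised per token (billions in, ratio out)."""
    return active_params_b / total_params_b

# Figures quoted in the post: 47B active out of 424B total.
print(f"{active_fraction(47, 424):.1%}")  # roughly 11% of weights active per token
```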

🔀 A heterogeneous MoE layout sends text and image tokens to separate expert pools while shared experts learn cross-modal links, so the two media strengthen rather than hinder each other
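
The dispatch pattern described above can be sketched minimally (this is an illustration of the general idea, not Baidu's implementation; all expert names and the toy hash gate are hypothetical):

```python
# Heterogeneous MoE dispatch sketch: text and image tokens are routed to
# modality-specific expert pools, while a shared pool sees every token
# and can therefore learn cross-modal links.
from typing import List, Tuple

TEXT_EXPERTS = ["text_e0", "text_e1"]    # hypothetical expert names
IMAGE_EXPERTS = ["img_e0", "img_e1"]
SHARED_EXPERTS = ["shared_e0"]

def route(token: Tuple[str, str]) -> List[str]:
    """Return the experts a (modality, content) token is dispatched to."""
    modality, content = token
    pool = TEXT_EXPERTS if modality == "text" else IMAGE_EXPERTS
    # Trivial top-1 gate for illustration: hash the content onto its pool.
    chosen = pool[hash(content) % len(pool)]
    return [chosen] + SHARED_EXPERTS     # shared experts always participate

for t in [("text", "hello"), ("image", "patch_07")]:
    print(t, "->", route(t))
```

In a real model the gate is a learned softmax over expert logits; the point here is only the split between per-modality and shared pools.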

⚡ Intra-node expert parallelism, FP8 precision, and fine-grained recomputation lift pre-training to XX% model FLOPs utilization on the biggest run, showing strong hardware efficiency
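
Model FLOPs utilization (MFU) is the fraction of the hardware's peak throughput spent on useful model math. A common estimate, shown here with illustrative numbers (the throughput and peak figures below are assumptions, not from the release):

```python
# MFU = useful model FLOPs per second / peak hardware FLOPs per second.
def mfu(flops_per_token: float, tokens_per_sec: float, peak_flops: float) -> float:
    return (flops_per_token * tokens_per_sec) / peak_flops

# Rough rule of thumb: forward+backward costs ~6 FLOPs per active parameter
# per token. Using the 47B active figure and made-up cluster numbers:
print(f"{mfu(6 * 47e9, 2_000, 1.4e15):.1%}")
```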

πŸ–ΌοΈ Vision-language versions add thinking and non-thinking modes, topping MathVista, MMMU and document-chart tasks while keeping strong perception skills

πŸ› οΈ ERNIEKit offers LoRA, DPO, UPO and quantization for fine-tuning, and FastDeploy serves low-bit multi-machine inference with a single command

βš–οΈ Multi-expert parallel collaboration plus 4-bit and 2-bit lossless quantization cut inference cost without hurting accuracy

![](https://pbs.twimg.com/media/GuqZOILXkAEyD77.png)

XXXXXX engagements

![Engagements Line Chart](https://lunarcrush.com/gi/w:600/p:tweet::1939533142445953078/c:line.svg)

**Related Topics**
[coins ai](/topic/coins-ai)

[Post Link](https://x.com/rohanpaul_ai/status/1939533142445953078)
