AsianFin 亞財社 (@AsianFinPress) on X, 1,659 followers
Created: 2025-07-25 22:37:01 UTC
🧠💥 Intellifusion Doubles Down on Inference AI Chips, Eyes 2026 Launch of Next-Gen Architecture
China’s Intellifusion is going all-in on AI inference, unveiling its latest X6000 Mesh accelerator (256 TOPS) and servers offering up to X PFLOPS, just ahead of the 2025 World Artificial Intelligence Conference.
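The server-level PFLOPS figure is scrambled ("X PFLOPS") in this guest-mode capture, so only the unit arithmetic can be illustrated. The sketch below is an assumption-laden example, not a vendor spec: it treats per-card throughput (256 TOPS for the X6000 Mesh, typically INT8 operations rather than floating-point) as adding linearly across accelerators, and uses the 1 PFLOPS = 1,000 TOPS scaling.

```python
# Illustrative unit conversion only. Assumptions (not from the post):
# - per-card throughput aggregates linearly across accelerators
# - "TOPS" and "PFLOPS" are treated as the same operation type for scaling
def aggregate_pflops(num_cards: int, tops_per_card: float = 256.0) -> float:
    """Aggregate throughput in PFLOPS-equivalent: 1 PFLOPS = 1000 TOPS."""
    return num_cards * tops_per_card / 1000.0

# Hypothetical example: eight 256-TOPS cards in one server chassis.
print(aggregate_pflops(8))  # → 2.048
```

Real server figures would depend on precision mode (INT8 vs FP8/FP4), interconnect overhead, and thermal limits, none of which this linear sketch models.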
🚀 CEO Chen Ning says 2025 is the tipping point:
“Inference will outpace training in growth and application.”
🔧 What’s New:
Self-developed 4th-gen NPU
Fully domestic D2D chiplet + mesh architecture
Real-time decoding of XXX video streams
Edge & data center servers (Shenmu 6203, Tianzhou 6408/680G)
📈 By the Numbers:
2024 revenue: ¥900M (↑81.3% YoY)
Q1 2025 revenue: ¥264M (↑168.2% YoY)
3-year compute deal worth ¥1.6B underway
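The YoY percentages above imply the prior-period baselines. A quick back-of-envelope check, assuming the growth rates quoted in the post are exact:

```python
# implied prior-period revenue = current revenue / (1 + growth rate)
def implied_prior(current_m: float, yoy_pct: float) -> float:
    """Return prior-period revenue (in ¥ millions) implied by a YoY growth %."""
    return current_m / (1 + yoy_pct / 100)

# FY2024 ¥900M at +81.3% YoY  -> implied FY2023 base
# Q1 2025 ¥264M at +168.2% YoY -> implied Q1 2024 base
print(round(implied_prior(900, 81.3), 1))    # → 496.4
print(round(implied_prior(264, 168.2), 1))   # → 98.4
```

So the figures imply roughly ¥496M full-year 2023 revenue and roughly ¥98M in Q1 2024, consistent with the growth narrative in the post.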
📱 On the consumer side:
Qiancheng AI tech adopted by Huawei, OPPO, Honor
“Dr. Luka” hardware sees rapid sales
Projected 50%+ consumer growth in H1 2025
🔮 Coming 2026: “Computing Power Building Blocks 2.0” — featuring Nova500 NPU (FP8/FP4 native), 3D hybrid-bonded memory, NB-Mesh interconnect, and NB-Link CPU-NPU shared memory.
🗣️ “Inference is China’s shot at reshaping the Fourth Industrial Revolution,” says Chen. “Invention means little without scale — that’s where we win.”
#AIChips #InferenceAI #ChinaTech #Intellifusion #Semiconductors #AIAcceleration #EdgeAI #HPC #AIinfrastructure
XX engagements
Post link: https://x.com/AsianFinPress/status/1948875162184810964