Huawei AI Supernode Stuns Tech World With Power Nvidia Didn’t Expect

Is Huawei closing in on Nvidia's dominance? Its AI supernode might outperform the NVL72 in computing speed and capability.

Huawei’s new AI supernode poses a serious challenge to Nvidia

Huawei has introduced a new AI system called the CloudMatrix 384 Supernode, which the company says can handle large-scale computing workloads on par with Nvidia’s flagship products.

The new system is being compared to Nvidia’s NVL72, the current leader in the AI chip market. Huawei’s system is designed to make AI workloads run faster and more smoothly, especially in large data centers.

More power than Nvidia’s NVL72

Nvidia’s NVL72, announced in March 2024, connects 72 GPUs (graphics processors) into one powerful unit using NVLink interconnect technology. Nvidia says this setup lets large AI models run inference up to 30 times faster than previous-generation systems.

Huawei’s CloudMatrix 384 Supernode, installed at its data centers in Wuhu City, China, can reportedly reach up to 300 petaflops of computing power, compared with 180 petaflops for Nvidia’s NVL72. (One petaflop is one quadrillion calculations per second.)

This shows Huawei’s supernode may be stronger in raw computing power.
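To put the reported figures side by side, here is a back-of-the-envelope sketch. The numbers are as reported in press coverage, not independently verified:

```python
# Back-of-the-envelope comparison of the reported peak-compute figures.
# Both numbers are as reported in press coverage, not independently verified.
PETA = 10**15  # one petaflop = one quadrillion (10^15) operations per second

cloudmatrix_pflops = 300  # Huawei CloudMatrix 384 Supernode (reported)
nvl72_pflops = 180        # Nvidia NVL72 (reported)

ratio = cloudmatrix_pflops / nvl72_pflops
print(f"CloudMatrix: {cloudmatrix_pflops * PETA:.2e} ops/s")  # 3.00e+17
print(f"NVL72:       {nvl72_pflops * PETA:.2e} ops/s")        # 1.80e+17
print(f"Reported advantage: {ratio:.2f}x")                    # 1.67x
```

On these figures, the Huawei system would offer roughly two-thirds more raw peak compute than the NVL72.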

What is a supernode?

A supernode is a computing architecture that pools large amounts of resources, such as CPUs, NPUs, storage, memory, and network bandwidth, into a single system. Those extra resources help AI models train and run faster by raising overall system throughput.

If confirmed, this suggests Huawei is getting closer to building powerful computing systems on its own, without depending on foreign suppliers, especially as tech tensions between the US and China continue.

Huawei launched its CloudMatrix platform in September 2024 to meet the growing demand for AI computing, demand that surged after the launch of AI models like OpenAI’s GPT series.

Huawei working with DeepSeek’s R1 model

Huawei is working with a Chinese startup called SiliconFlow to use the new supernode for DeepSeek’s AI model, R1. DeepSeek, a company based in Hangzhou, launched R1 in January, and the model quickly gained worldwide attention.

Reports say the CloudMatrix 384 Supernode, which runs on Huawei’s own chips instead of Nvidia’s, can process up to 1,920 tokens per second. In AI, “tokens” are the small pieces of text a model reads and generates.
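To make that throughput figure concrete, here is a rough sketch of what 1,920 tokens per second means in everyday terms. The 0.75 words-per-token figure is a common rule of thumb for English text, not a property of any specific model:

```python
# Rough illustration of the reported token throughput.
# 0.75 words per token is a rule-of-thumb assumption for English text,
# not a measured property of DeepSeek R1 or the CloudMatrix system.
tokens_per_second = 1920   # reported CloudMatrix 384 Supernode throughput
words_per_token = 0.75     # rule-of-thumb assumption

words_per_second = tokens_per_second * words_per_token
print(f"~{words_per_second:.0f} words/second")       # ~1440
print(f"~{words_per_second * 60:.0f} words/minute")  # ~86400
```

Under that assumption, the system would generate on the order of 1,400 words of English text every second.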

Other companies are also investing in AI

Other Chinese tech companies are also spending big on AI. For example, Alibaba said in February that it will invest 380 billion yuan (around $52.4 billion) in AI computing over the next three years.

At the same time, Nvidia’s CEO Jensen Huang said that AI models are using more and more computing power.

After DeepSeek launched R1 and another model V3, some people thought AI might need less computing power. But Huang disagreed.

“Inference models can need 100 times more computing resources,” he said. Inference is the step where a trained AI model gives answers or makes decisions.

DeepSeek praised, but computing still needed

Huang also praised DeepSeek’s work. He said its R1 model is one of the best and is helping developers all over the world.

But he added that running models like R1 needs top-level chips, which is why companies will still need powerful processors.

Nvidia’s partner DDN, a company that provides data storage systems, also supports this view. Its tools are used in AI, research, and big-data systems.

Some investors have asked if all this money spent on AI systems is really necessary, especially if AI training becomes easier.

But Huang says post-training is very important. This is the phase where AI systems improve their ability to solve problems and make smart choices. So the demand for good chips will keep rising.

AI’s future needs more power

Huang said many people think AI only has two parts: training and answering. But he believes the third part—post-training—is where AI actually becomes smarter.

He said the release of DeepSeek’s R1 model has brought new energy into the AI world. “It’s really exciting,” he said.

Cloud companies also told Insider that Nvidia’s chips are still in high demand.

According to analysts, the way AI is used is changing. Inference is becoming more common, and it needs powerful systems. Nvidia’s latest chip architecture, Blackwell, is designed for exactly this.

Even though Nvidia still leads the AI chip market, some experts say competition is slowly starting to affect its position.

“Competition is starting to make a difference, but it’s not a big threat yet,” said Lucas Keh from Third Bridge.
