In recent months, artificial intelligence has been on everyone’s lips, and so has NVIDIA, which, thanks to the development of chips designed specifically for it, has become the most valuable company in the world. Of course, the company keeps working to remain the spearhead of this technology, but at what cost? Today we learned that its next superchip, the NVIDIA GB200, which integrates both CPU and GPU, will have a power consumption of 2,700 watts. A real scandal.
AI has managed to carve out a place for itself in our daily lives across different areas, and companies are all stepping on the accelerator to jump on the bandwagon and claim their share of the pie. What we don’t see is that for AI to work, it needs huge server farms like the one Elon Musk showed us a few weeks ago, with no fewer than 100,000 NVIDIA H100 graphics cards, a server farm that practically needs a power plant of its own…
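To put that cluster into perspective, here is a minimal back-of-envelope sketch in Python. It assumes the commonly cited figure of roughly 700 W per H100 module and deliberately ignores CPUs, networking and cooling, so the real facility draw would be even higher:

```python
# Rough estimate of GPU-only power draw for a 100,000-GPU H100 cluster.
# Assumes ~700 W per H100 module (commonly cited figure); CPUs, networking
# and cooling overhead are ignored, so the real total is higher.

NUM_GPUS = 100_000
H100_WATTS = 700  # assumed per-GPU power draw

gpu_power_mw = NUM_GPUS * H100_WATTS / 1_000_000  # watts -> megawatts
print(f"GPU-only draw: {gpu_power_mw:.0f} MW")    # -> 70 MW
```

Seventy megawatts for the GPUs alone, before cooling and the rest of the infrastructure, is already the kind of continuous load you would expect from a dedicated power plant.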
NVIDIA’s Next Superchip’s Power Consumption Is Off the Charts
Today, the well-known research firm TrendForce published a new report claiming that the energy consumption of AI servers is rising as fast as their computing power, with the figures climbing practically by the day. Moreover, the report specifically highlights NVIDIA’s next-generation Blackwell architecture, which will be led by the GB200 superchip.
In theory, this NVIDIA Blackwell GB200 superchip will not be officially launched until next year and will replace the current Hopper platform, becoming the company’s main high-end product.
We are looking at a full-fledged “superchip” with overwhelming power, but its consumption is starting to raise serious concerns. To put this in context, a single Blackwell B200 chip would draw 1,000 watts on its own, while the GB200 superchip, which combines a Grace CPU with two Blackwell GPUs, reaches a terrifying 2,700 watts.
If we look back a bit, NVIDIA’s best-known AI product so far, the H200, has a maximum consumption of 700 watts, while other products like the NVIDIA H20 only require 400 W. Meanwhile, the previous GH200 superchip (Grace + Hopper) draws 1,000 watts, which, as we mentioned earlier, is the consumption of a single Blackwell B200 chip.
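To make the generational jump easier to see, here is a small Python sketch that lays out the figures quoted above side by side. These are the reported numbers from this article, not official spec-sheet values:

```python
# Power figures quoted above, in watts (reported values, not official specs).
chips = {
    "H20": 400,
    "H200": 700,
    "GH200 (Grace + Hopper)": 1_000,
    "B200 (single Blackwell GPU)": 1_000,
    "GB200 (Grace + 2x Blackwell)": 2_700,
}

baseline = chips["H200"]  # NVIDIA's best-known AI chip so far
for name, watts in chips.items():
    print(f"{name:<30} {watts:>5} W  ({watts / baseline:.1f}x the H200)")
```

Note also that two B200 GPUs already account for 2,000 of the GB200’s 2,700 watts; the remaining ~700 W would be left for the Grace CPU and the rest of the module, although that split is an inference from the figures above rather than an official breakdown.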
In other words, NVIDIA’s AI chips are getting out of control. The company is increasing performance through brute force with seemingly no regard for efficiency; yes, these are the most powerful AI chips in the world, but at a brutal energy cost.
Now, putting all this data into perspective, imagine what an AI server farm with thousands of these superchips will need. It is no longer just about their power consumption, but also about the very advanced liquid cooling solutions required to keep them running. People often joke that it takes a nuclear reactor to power these things… in the end, that may turn out to be almost literal.
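As a rough illustration of that scale, here is a hedged back-of-envelope sketch in Python. The 2,700 W figure is the one reported above, while the superchip count is arbitrary and the 1.3 PUE (power usage effectiveness) is a typical assumption for a modern liquid-cooled data center, not a measured value:

```python
# Back-of-envelope for a hypothetical farm of GB200 superchips.
# All inputs are assumptions for illustration: 2,700 W is the reported
# per-superchip figure, the count is arbitrary, and PUE 1.3 is a typical
# (assumed) overhead factor for cooling and facility power.

GB200_WATTS = 2_700
NUM_SUPERCHIPS = 10_000   # hypothetical deployment size
PUE = 1.3                 # facility power / IT power

it_load_mw = NUM_SUPERCHIPS * GB200_WATTS / 1e6
facility_mw = it_load_mw * PUE
print(f"Chip load:     {it_load_mw:.0f} MW")   # -> 27 MW
print(f"With overhead: {facility_mw:.0f} MW")  # -> ~35 MW
```

Tens of megawatts of continuous draw for a single deployment is exactly the scale at which dedicated power generation stops sounding like a joke.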