The rumor mill has been predicting for some time that the upcoming graphics cards from AMD (RX 7000) and Nvidia (RTX 4000) will have significantly higher power consumption than their predecessors. In an interview, AMD senior vice president Sam Naffziger not only confirms this, but also expects the trend to continue in the coming years.
The background is the slowing transition to more efficient manufacturing processes for graphics chips. This can be counteracted to some extent by optimizing the architecture, which, according to Naffziger, AMD is doing. However, he says this does not change the core of the problem:
Rather, the demand for gaming and computing power is accelerating, and at the same time, the underlying process technology — and its rate of improvement — is slowing down quite dramatically. The power consumption will therefore simply continue to increase. While we have a multi-year roadmap of very significant efficiency gains to flatten this curve, the trend is there.
At the same time, AMD sees itself in a better starting position than Nvidia. In order not to be left behind in the battle for the performance crown, the big green competitor (not mentioned by name) has no choice but to accept even higher power consumption:
In the end, performance matters, but even if our designs are more energy efficient, that doesn’t mean we won’t increase power consumption when the competition is doing the same. It’s just that they will have to push it much higher than we will.
How big the difference in performance and power consumption will ultimately be can only be determined by independent tests once the new graphics cards have been released. They are currently expected in the last quarter of 2022.
What is the expected consumption?
The rumors so far indicate that the maximum consumption could increase by around 100 watts with the RTX 4000 generation. The RTX 3080 sits at 320 watts, while a hefty 420 watts is rumored for the RTX 4080.
To put this into perspective, the following table shows the official power consumption figures for Nvidia’s x80 models over the past ten years. The RTX 3080’s 320 watts was already high by comparison, but at 420 watts the series would reach a whole new dimension:
Model | Power consumption | Release |
---|---|---|
RTX 4080 | 420 watts (?) | 2022 (?) |
RTX 3080 | 320 watts | 2020 |
RTX 2080 | 215 watts | 2018 |
GTX 1080 | 180 watts | 2016 |
GTX 980 | 165 watts | 2014 |
GTX 780 | 250 watts | 2013 |
GTX 680 | 195 watts | 2012 |
And what about AMD?
For the new graphics cards of the RX 7000 series, consumption figures of over 400 watts are also circulating. So far, at least, there are no clear signs that AMD is in a significantly better position here, assuming similar performance. It is nevertheless conceivable that, thanks to the new chiplet design, AMD will not have to push power consumption quite as far as Nvidia.
Even more nebulous, but no less interesting, is the look further into the future, i.e. at the RTX 5000/RX 8000 or even the RTX 6000/RX 9000 generations. Even if neither AMD nor Nvidia knows exactly what the power consumption of these upcoming GPUs will look like, the signs currently do not point in a particularly positive direction.
Also not to be forgotten: higher power consumption places greater demands on cooling. In the following article, you will find out how to optimize your current graphics card in terms of efficiency and required cooling capacity:
I do this with every graphics card – and you should do it too
How do you assess the situation? Will the power consumption of GPUs continue to rise steadily over the next few years? And if so, do you see that as a problem, or does the power consumption of your PC only play a minor role for you? Please let us know!