One of the constraints hardware designers face is power consumption: the goal is always to reach a certain level of performance within a specific power budget. The problem? Users always want the graphics card that delivers the most frames in their favorite games, along with better image quality, and that means power consumption has to rise from time to time.
Why will your graphics card increase your electricity bill?
Of all the components inside your computer, the one with the biggest impact on the electricity bill is the graphics card. It is no accident that a CPU drawing around 200 W is considered excessive, while for a graphics card that figure is normal. There are already models approaching 400 W, and the arrival of the new 12+4-pin connector can double the power available to a graphics card, and therefore double its running cost.
The GPU itself will increase its power consumption
The main component of any graphics card is its GPU, which, unlike a CPU, is made up of dozens of cores, and in some cases already hundreds. In any case, we must start from the idea that generating graphics is a task that demands massive parallelism. For example, a Full HD image has about 2 million pixels to process, and at 4K resolution that figure rises to about 8 million.
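As a quick sanity check of those figures, a few illustrative lines of Python (not from the original article) reproduce the pixel counts and show the per-second workload at a common 60 FPS target:

```python
# Pixel counts behind the "2 million" and "8 million" figures.
resolutions = {
    "Full HD (1920x1080)": (1920, 1080),
    "4K UHD (3840x2160)": (3840, 2160),
}

for name, (w, h) in resolutions.items():
    pixels = w * h
    # At 60 FPS the GPU must shade this many pixels every second.
    per_second = pixels * 60
    print(f"{name}: {pixels:,} pixels/frame, {per_second:,} pixels/s at 60 FPS")
```

Running it shows Full HD at 2,073,600 pixels per frame and 4K at 8,294,400, i.e. roughly a fourfold jump in work per frame.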
Because each pixel, and each other graphical primitive that composes the scene, can be processed independently of the rest, a design with a large number of cores becomes important, all within a fixed power budget. So, generation after generation, engineers at Intel, AMD, and NVIDIA have to rack their brains to fit more cores into the same budget while reaching higher clock speeds. For years this was possible thanks to new manufacturing nodes; now, however, the node has become the limiting factor, and if we want to reach certain performance targets, graphics card power consumption has to rise.
To understand this better, think of a neighborhood that is renovated from time to time: the number of houses gradually grows, but they all depend on the same power plant. Sooner or later the plant itself has to be upgraded to supply electricity to all of them.
VRAM also plays an important role
Besides the graphics chip, a series of smaller memory chips is connected to the GPU to store data and instructions. This type of memory differs from the RAM used as system memory in that it prioritizes the amount of data it can transmit per second over access latency. That bandwidth comes at a power cost, and it is what is driving the adoption of large caches inside the graphics chip itself to reduce the energy spent on data access.
The problem is that the volumes of data a graphics card has to manage are so large that memory of that size cannot fit inside the chip, so external memory is still needed. The goal for the future, though, is to grow the last-level caches until nearly 100% of memory accesses cost less than 2 picojoules per transmitted bit. How much does VRAM cost today? Around 7 picojoules per transmitted bit, and that could rise to 10 in the future.
Bearing in mind that power, measured in watts, is energy in joules per second, the problem becomes clear: as the graphical complexity of games increases, and with it the number of cores, the bandwidth required also increases, and so does the energy spent moving data.
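The watts-equals-joules-per-second relationship can be sketched in a few lines of Python. The per-bit energy figures come from the text above; the 1 TB/s bandwidth figure is an assumption chosen only because it is in the ballpark of a current high-end card:

```python
def vram_power_watts(bandwidth_gb_s: float, pj_per_bit: float) -> float:
    """Power drawn by memory transfers: watts = joules per second.

    bandwidth_gb_s -- sustained transfer rate in GB/s (assumed figure)
    pj_per_bit     -- energy per transmitted bit in picojoules
    """
    bits_per_second = bandwidth_gb_s * 1e9 * 8   # GB/s -> bits/s
    joules_per_bit = pj_per_bit * 1e-12          # pJ -> J
    return bits_per_second * joules_per_bit

# 7 pJ/bit (external VRAM today) vs 2 pJ/bit (on-die cache target),
# both at an assumed 1 TB/s of sustained bandwidth.
print(vram_power_watts(1000, 7))  # 56.0 W
print(vram_power_watts(1000, 2))  # 16.0 W
```

At the same bandwidth, keeping data in large on-chip caches at 2 pJ/bit would cut the energy spent on memory traffic to roughly a third of what external VRAM costs, which is exactly why designers are pushing in that direction.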
Your electricity bill will not go up that much with your future graphics card
If you’re worried about a rising electricity bill because of your graphics card, let us reassure you. Only high-end models, designed to run games at 4K with ray tracing enabled or at frame rates in the hundreds, will consume more than current graphics cards and therefore use the new 12+4-pin connector. The rest of the lineup will stay within today’s consumption range of 75 W to 375 W.
So if your screen is 1080p or even 1440p and you aim to play at 60 FPS, you don’t have to worry about your graphics card drawing too much, since you won’t need one that pulls 450 W or more. It’s the enthusiasts, those who want maximum performance at any cost, who will have to worry most about a rising electricity bill.