Today we are going to look at the power consumption of a high-end, very powerful computer: how much it affects the monthly bill, and whether there is a real difference compared with an ordinary computer that is not used for gaming.
How much does it consume?
The first thing to know is that every component draws power, not just the graphics card or the processor; even the RAM needs energy to work. Plugging in a USB stick also adds a small extra draw, so the difference from the figures we discuss below will depend on your usage.
We will analyze a standard gaming PC: an above-average machine with a good graphics card and good components, but short of professional hardware, which we will cover in a separate article. We want to focus on the consumption of something you might have at home, because, oddly enough, the difference between the two is far greater than you might think.
Most of the energy goes to the processor and the graphics card; logically, the more cores and power your PC has, the more it will consume.
A high mid-range Intel CPU like the i7-11700K draws around 130 W on its own under load, whereas a standard desktop chip like the i5-10400 sits at about 65 W.
The same goes for the graphics card. An RTX 3090 Ti can reach 450 W, while the integrated graphics in a low-end computer does not exceed 100 W.
The rest of the components consume roughly the same in any computer. Plugging in a flash drive, as we said at the beginning, also costs money, but the amount is negligible and does not change whether the PC is better or worse.
In summary, if we add up a good computer with a high-end processor and a good graphics card, we easily reach 600-700 W while running a demanding game.
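The addition above can be sketched as a simple power budget. The wattages below are illustrative estimates based on the figures mentioned in this article, not measurements; the lump sum for the remaining components is an assumption.

```python
# Rough power budget for a high-end gaming PC under load.
# All wattages are illustrative estimates, not measurements.
components_watts = {
    "CPU (e.g. i7-11700K)": 130,
    "GPU (e.g. RTX 3090 Ti)": 450,
    "RAM, drives, fans, motherboard": 80,  # assumed lump sum for the rest
}

# Total draw at the wall (ignoring power-supply inefficiency).
total_watts = sum(components_watts.values())
print(f"Estimated load: {total_watts} W")  # lands in the 600-700 W range
```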
But it does not stop there; we are missing a very important piece of data: the monitor. Here too there is a big difference between a gaming screen, with its very high refresh rate and resolution, and the standard screen of any other user. A good gaming monitor will exceed 50 W.
So how much does it cost? Taking the high-end setup above and assuming a gamer plays an average of 5-6 hours a day, that works out to roughly 4-5 kWh per day, which at typical rates means around $1 per day, or, what amounts to the same thing, about $30 per month or $365 per year. It is not outrageous, but a family that pays around $50 a month for electricity would see its bill rise to $80 if it has a son who is a video game fan, to put it mildly. It is noticeable.
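The back-of-the-envelope math above can be reproduced in a few lines. The electricity rate of $0.22/kWh is an assumption chosen to match the article's "around $1 per day" figure; your own tariff will differ.

```python
# Back-of-the-envelope electricity cost for daily gaming sessions.
# Assumed inputs: 700 W PC, 50 W monitor, 6 h/day, $0.22/kWh (hypothetical rate).
pc_watts = 700
monitor_watts = 50
hours_per_day = 6
price_per_kwh = 0.22  # assumed rate; check your own electricity tariff

# Energy in kWh = watts x hours / 1000
kwh_per_day = (pc_watts + monitor_watts) * hours_per_day / 1000
cost_per_day = kwh_per_day * price_per_kwh
cost_per_month = cost_per_day * 30

print(f"{kwh_per_day:.1f} kWh/day -> ${cost_per_day:.2f}/day, ${cost_per_month:.0f}/month")
# 4.5 kWh/day -> $0.99/day, $30/month
```

Swapping in your local price per kWh and your actual hours of play is all it takes to adapt the estimate.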
This is also why many cryptocurrency mining farms had to close: the cost of the electricity was higher than the value of what they produced.