On what basis do we assert this? On the fact that, oddly enough, if you underclock or power-limit an RTX 40 card, it gets a massive boost in performance per watt, which suggests these chips were designed for exactly that scenario. Most likely, NVIDIA already had the chip finished in Verilog or VHDL long before Ethereum mining ended, and by then it was too late to back out.
Either way, the majority of the public isn't going to spend more than 500 euros on a graphics card; in other words, they won't invest more than a next-generation console costs. And bear in mind that those console prices were set right at the start of the COVID pandemic, when its effects on the global economy were still unknown.
How does this affect the global market?
Since data beats any narrative, the best thing is to draw a chart showing how quarterly gaming GPU sales are falling. With it we can quantify NVIDIA and AMD's blunder of designing their new generations with mining rigs in mind rather than the average PC gamer. They could easily have aimed at both markets, but they chose not to.
The problem of over-engineering
The concept, although it may sound like jargon to you, refers to hardware that carries features which bring no value to the end user but do incur additional cost. Ray tracing is often accused of this, but that is little more than AMD propaganda. The reality is that there is a series of features, in particular a good part of those in DirectX 12 Ultimate, that make us wonder why they appear on the spec sheet at all if no current game uses them.
Typically, when a game adopts a new graphics technology, it appears alongside the hardware that provides it. It makes no sense to see things like Mesh Shaders or DirectStorage completely absent from games while still having an impact on the final price of the graphics card. After all, they mean extra transistors inside the chip and a larger die, which ultimately makes it more expensive.
As a curiosity, NVIDIA increased the L2 cache capacity on its RTX 40 cards by 16x, because the cache hit rate was very low on the RTX 30s. The goal? To reduce the energy cost of each operation. The effect? A much larger chip, which has a negative impact on costs. The same can be said of RDNA 2 and RDNA 3, whose additional cache level serves exactly the same purpose.
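To see why a higher cache hit rate cuts energy per operation, here is a rough back-of-envelope sketch in Python. The picojoule figures and hit rates are illustrative assumptions, not NVIDIA's real numbers; the only point is that a hit in on-die cache costs far less energy than a trip out to VRAM.

```python
# Back-of-envelope: average energy per memory access as a function of the
# L2 hit rate. The energy figures are illustrative assumptions, not real
# NVIDIA data -- an on-die cache hit is far cheaper than a VRAM access.

E_L2_HIT_PJ = 50.0   # assumed energy of an L2 cache hit, in picojoules
E_VRAM_PJ = 500.0    # assumed energy of a miss served from VRAM

def avg_energy_pj(hit_rate: float) -> float:
    """Average energy per access for a given L2 hit rate (0.0 to 1.0)."""
    return hit_rate * E_L2_HIT_PJ + (1.0 - hit_rate) * E_VRAM_PJ

# A low hit rate (small-cache scenario) vs. a high one (much bigger L2):
low = avg_energy_pj(0.3)    # 0.3*50 + 0.7*500 = 365 pJ per access
high = avg_energy_pj(0.8)   # 0.8*50 + 0.2*500 = 140 pJ per access
print(f"low hit rate:  {low:.0f} pJ/access")
print(f"high hit rate: {high:.0f} pJ/access")
```

With these assumed numbers, raising the hit rate from 30% to 80% cuts the average energy per access by more than half, which is the kind of saving that justifies spending die area on a much larger cache.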
So what is the solution to the problem?
It is around Christmas that many people take advantage of the holidays to treat themselves, or someone else, to a PC upgrade. Yet neither AMD nor NVIDIA releases new graphics cards for the mainstream market at that time, only for a high end whose purchasing power holds up all year round. While having a halo product is important for marketing, part of NVIDIA and AMD's blunder may simply come down to bad timing when launching their new graphics cards.
The last quarter of the year is when most companies launch their mass-market products, riding the wave of pure holiday consumerism. Yet the main GPU manufacturers continue to believe that the bulk of the market lies in 1,000-euro cards. The reality of the market is very different, and we must not forget, however many other uses they may have, that graphics cards are sold to play video games. And we repeat ourselves: there are devices called video game consoles that perform the same function and compete with graphics cards.
So the solution to NVIDIA and AMD's blunder is simple: since the vast majority of users still game on Full HD monitors, offer them a graphics card capable of running every current game, even the most demanding ones, at Full HD resolution, with the best possible quality and at more than 60 frames per second, without problems. The first company to do so will eat the market.
If it’s so simple, why did NVIDIA and AMD screw up?
Because they fear that if midrange or low-midrange graphics cards are good enough, they will end up cannibalizing the high end. However, each of us has limited purchasing power, and holding back the performance of the cheapest ranges has turned out to be an incredible mistake.
Just do a mental exercise, which will help you understand NVIDIA and AMD's blunder. Travel back to the past and tell your former self that a graphics card from 2006 will still be good enough to play games in 2013. Obviously, your old self would look at you in disbelief. Well, that is what we are living through now, all because this new generation of graphics cards was designed to fill the coffers of both major manufacturers through mining, even as the words PC Gaming come out of their mouths.
And no, we are not seeing the glass as half empty; the point is that things would have resolved themselves and the market would not have stagnated if they had covered the need for a graphics card to game on at a reasonable price.