Until recently, the goal was a stable 60 frames per second. However, high-refresh-rate panels have become increasingly important in two markets: virtual reality and esports. Although we cannot consciously perceive every individual frame beyond a certain rate, it has been shown that our brain still reacts to these stimuli. That is why the first 500Hz NVIDIA G-SYNC monitor already exists, and it comes from ASUS. Nor is it just a promise: it was shown for the first time on an ASUS ROG Swift monitor. Want to know the details?
There is a myth that our eye cannot see more than 60 frames per second. In reality, what we see is what the brain processes, and the brain has been shown to respond to high-speed stimuli. It is not common to face them in daily life, but ask any racing driver. In the world of esports, getting your brain the necessary information at the right moment is the key to winning games. This is called advance vision: our brain's ability to react in a very short time. That is the principle high-refresh-rate monitors are built on, and NVIDIA is betting on it.
ASUS ROG Swift, the first 500Hz G-SYNC compatible monitor
A refresh rate of 500 Hz means the image on the monitor changes every 2 milliseconds, which beats the previous record, and that is what ASUS has achieved together with NVIDIA in the form of a new G-SYNC monitor: a display that can show up to 500 frames per second. Of course, to take full advantage of it, the entire frame pipeline, from moving the mouse to lighting the pixels on the screen, should take less than 2 milliseconds in total.
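The relationship between refresh rate and frame time is simple arithmetic: the per-frame budget is one second divided by the refresh rate. A minimal sketch (the helper name `frame_time_ms` is ours, not from any vendor API):

```python
# Hypothetical helper: compute the per-frame time budget
# for a given monitor refresh rate.
def frame_time_ms(refresh_hz: float) -> float:
    """Return the time available per frame, in milliseconds."""
    return 1000.0 / refresh_hz

print(frame_time_ms(500))  # 2.0 ms per frame at 500 Hz
print(frame_time_ms(60))   # roughly 16.7 ms per frame at 60 Hz
```

At 500 Hz the whole pipeline, input sampling, rendering, and scan-out, must fit inside that 2 ms window for the monitor's speed to pay off.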
The ASUS ROG Swift monitor with 500Hz G-SYNC is a TN-type panel with 1080p resolution
One of the most important aspects of the AI involved is motion prediction, which makes it possible to create frames by taking into account how the different elements on screen moved in the previous frame. At these speeds, our eye does not have time to register the exact level of detail. We therefore believe that NVIDIA uses Deep Learning algorithms running on its graphics cards to generate images automatically at such high speeds, interleaving them with the frames rendered by the GPU in the usual way.
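The motion-prediction idea described above can be illustrated with a toy example: if you know an object's screen position in the two previous frames, you can linearly extrapolate where it will be in the next, generated frame. This is only an illustrative sketch of the concept, not NVIDIA's actual algorithm, which the article can only speculate about:

```python
# Toy illustration of motion-based frame prediction (not NVIDIA's
# real method): extrapolate an object's screen position from its
# velocity across the two previous frames.
def predict_position(prev, curr):
    """Linearly extrapolate the next (x, y) position of an object."""
    vx = curr[0] - prev[0]  # horizontal velocity, pixels per frame
    vy = curr[1] - prev[1]  # vertical velocity, pixels per frame
    return (curr[0] + vx, curr[1] + vy)

# An object moving 4 px right and 1 px down per frame:
print(predict_position((100, 50), (104, 51)))  # (108, 52)
```

A real frame-generation system would predict every pixel rather than a single object, and would use a trained network instead of linear extrapolation, but the principle of inferring the next image from observed motion is the same.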