On the market there are many technologies related to our monitors and televisions, sold under trade names such as VRR, Adaptive Sync, AMD FreeSync and NVIDIA G-SYNC. They all seem to do the same thing, yet there are subtle differences between them. So let's take a look at variable refresh rate technologies.
Early video game systems were so rudimentary that they lacked VRAM and took advantage of the visual persistence of CRT displays, generating the image on screen at the pace of the electron beam. However, as the cost of video memory fell, systems switched to rendering the image in a framebuffer first and then sending it to the video output.
Why do we need variable refresh rates?
This process continues today, and because frames are not generated at the same frequency as the monitor's refresh rate, visual artifacts such as tearing appear. Tearing happens when there is a mismatch between the device that outputs the video signal, the graphics card, and the device that displays the image, the monitor.
The solution to the problems caused by this lack of synchronization? Do something similar to what the old VGA outputs did: give the graphics card control over the timing of each frame, in terms of horizontal and vertical synchronization, on the display device. This way the signal is fully synchronized and the problems derived from the mismatch disappear. This measure prevents not only screen tearing but also image stuttering.
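To make the idea more concrete, here is a purely illustrative Python sketch, not real driver or display code. The refresh rate, frame time and function names are assumptions chosen for the example. It contrasts a fixed-refresh monitor, which scans out on its own clock and can end up mixing two frames (tearing), with a variable-refresh monitor, which starts each scan-out only when the graphics card has a complete frame ready.

```python
# Illustrative timing sketch only; numbers are assumed, not from any real panel.

REFRESH_INTERVAL_MS = 1000 / 60   # fixed 60 Hz monitor
FRAME_TIME_MS = 22.0              # assumed GPU frame time (~45 FPS)

def fixed_refresh(duration_ms=200):
    """Monitor scans out every 16.7 ms regardless of the GPU: if a new frame
    finishes mid scan-out, the top and bottom of the screen show different
    frames -> tearing."""
    tears = 0
    next_frame_done = FRAME_TIME_MS
    t = 0.0
    while t < duration_ms:
        scan_start, scan_end = t, t + REFRESH_INTERVAL_MS
        if scan_start < next_frame_done < scan_end:   # frame swap during scan-out
            tears += 1
            next_frame_done += FRAME_TIME_MS
        t = scan_end
    return tears

def variable_refresh(duration_ms=200):
    """VRR: each scan-out begins exactly when the GPU presents a frame,
    so a scan-out never mixes two frames."""
    t, tears = 0.0, 0
    while t < duration_ms:
        t += FRAME_TIME_MS      # the refresh follows the GPU's frame pace
    return tears                # always 0 in this model

print("tears with fixed 60 Hz:", fixed_refresh())   # > 0
print("tears with VRR:", variable_refresh())        # 0
```

In other words, with a fixed refresh the display clock and the GPU drift against each other, while with a variable refresh the display simply follows the GPU, which is the whole point of these technologies.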
What technologies exist for variable refresh rate?
However, instead of one standard, several variable refresh rate technologies have been created in an attempt to solve the problem. This has ended up increasing confusion among buyers, since it complicates not only the choice of monitor but also the choice of graphics card for our PC. The confusion arises because two kinds of standards appeared at the same time: on the one hand, those tied to the video interface used (VESA Adaptive Sync and HDMI VRR), and on the other, technologies tied to a particular graphics card manufacturer, such as AMD FreeSync Premium and NVIDIA G-SYNC.
So let's review each of the ones that currently exist so you can tell them apart.
VESA Adaptive Sync
The first of the variable refresh rate technologies we will discuss is the one defined by VESA, the body that sets the standards for computer monitor specifications. Because there is still a separation, more than just bureaucratic, from the world of televisions, many TV manufacturers do not follow the standards of the Video Electronics Standards Association most of the time.
Adaptive Sync was first included in version 1.2a of DisplayPort and has been retained in later versions of the standard. To use it, the device must therefore have this video interface, and since DisplayPort is rarely found on conventional televisions, many games do not take advantage of it, as developers want their features to reach as many people as possible. Unfortunately, current-generation consoles lack DisplayPort outputs.
AMD FreeSync
AMD FreeSync is a blatant case of rebranding, since it is nothing more than the Adaptive Sync we talked about in the previous section. Any Radeon graphics card or Ryzen APU can therefore use Adaptive Sync; AMD simply sells it under its own brand.
However, AMD has taken the liberty of extending it under the names FreeSync Premium and FreeSync Premium Pro, adding support for HDR and Low Framerate Compensation, which inserts duplicate "ghost" frames when the frame rate falls below the monitor's minimum supported refresh rate. Implementing these technologies requires additional components in the monitor's circuitry, so they cannot be used by monitors that only support plain Adaptive Sync.
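To illustrate the idea behind Low Framerate Compensation, here is a minimal Python sketch. The function name and the range values are our own assumptions, not AMD's implementation: when the frame rate drops below the monitor's minimum VRR rate, each frame is shown two or more times so the effective refresh rate lands back inside the supported range.

```python
def lfc_refresh_rate(fps, vrr_min=48, vrr_max=144):
    """Toy model of Low Framerate Compensation (hypothetical helper, not AMD code):
    if the GPU frame rate drops below the monitor's minimum VRR rate, repeat each
    frame enough times that the effective refresh rate is back inside the range.
    Returns (refresh_rate_used, times_each_frame_is_shown)."""
    if fps >= vrr_min:
        return fps, 1                      # already inside the range, nothing to do
    multiplier = 2
    while fps * multiplier < vrr_min:      # try 2x, 3x, ... until we are in range
        multiplier += 1
    refresh = fps * multiplier
    if refresh > vrr_max:
        raise ValueError("VRR range too narrow for LFC (roughly needs max >= 2 * min)")
    return refresh, multiplier

print(lfc_refresh_rate(30))   # (60, 2): each frame shown twice, panel runs at 60 Hz
print(lfc_refresh_rate(20))   # (60, 3): each frame shown three times
```

The "ghost" frames are simply repeats of the last rendered frame, which is why the technique needs extra logic in the monitor or driver rather than just a compatible port.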
As a side note, some FreeSync-compatible monitors and devices can apply it over their HDMI interface. There are also low-cost monitors that run at a 75 Hz refresh rate and are fully FreeSync compatible, but with one problem: their minimum refresh rate is 48 Hz, so if the GPU delivers frames at less than that rate, image artifacts may appear.
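A quick self-contained check, using the 48 Hz minimum and 75 Hz maximum mentioned above (assuming they describe the same panel), shows why such a narrow range cannot fall back on frame doubling and therefore produces artifacts at low frame rates:

```python
# Budget FreeSync monitor from the text: assumed VRR range of 48-75 Hz.
VRR_MIN, VRR_MAX = 48, 75

for fps in (60, 40):
    if fps >= VRR_MIN:
        print(f"{fps} FPS: inside the VRR range, no artifacts expected")
    elif fps * 2 <= VRR_MAX:
        print(f"{fps} FPS: could be rescued by frame doubling (LFC)")
    else:
        print(f"{fps} FPS: below 48 Hz and doubling would exceed 75 Hz -> artifacts likely")
```

At 40 FPS, doubling would require an 80 Hz scan-out that the panel cannot deliver, so the variable refresh range simply runs out, which matches the problem described above.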
G-SYNC
In a rather cynical exercise, and seeing how its G-SYNC proposal might disappear altogether, NVIDIA came up with a marketing move very similar to AMD's: branding the DisplayPort Adaptive Sync support of its graphics cards as G-SYNC Compatible, in order to mentally link the VESA protocol with its own technology. In this respect, Jensen Huang's company is just as guilty as AMD, or even more so.
We say this because a G-SYNC Compatible monitor does not give you access to all the features of the full G-SYNC standard.
So G-SYNC and G-SYNC Compatible are not the same, although they serve the same purpose. In any case, this has forced NVIDIA to evolve its solution beyond what Adaptive Sync can offer; like its rival AMD, it adds HDR support, up to 1000 nits in the Ultimate version, and improved input lag.
VRR or Variable Refresh Rate for HDMI
Under the acronym VRR we find the proposal from HDMI.org, which is essentially the same thing as Adaptive Sync but for the HDMI port. This means that video game consoles can take advantage of it on monitors and televisions that support version 2.1 of the standard. Note that if your monitor uses an older version of HDMI, you will not be able to use this variable refresh rate technology.
So it is the same functionality as Adaptive Sync, but designed for HDMI output. The problem? While the VESA solution is an integral part of its base standard, HDMI 2.1 left VRR as a completely optional feature for monitor manufacturers. This means that if the display's HDMI controller does not support it, other solutions must be used to implement it, which can cost the graphics card some performance, since the GPU has to handle something that should be the display controller's job.