In recent years we have been able to connect our PCs to HDTVs, thanks to the fact that HD resolutions work perfectly well on a PC. But how have they evolved?
What are the differences between HD resolutions?
Obviously, the first difference is the number of pixels on screen: although all three resolutions share the sixteen-ninths format, meaning the ratio of horizontal to vertical resolution is 16:9, the total pixel count is different in each of them.
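To put concrete numbers on that difference, here is a quick, purely illustrative sketch of the three pixel counts and their common aspect ratio:

```python
# Pixel counts for the three TV standards covered in this article.
resolutions = {
    "HD Ready (720p)":  (1280, 720),
    "Full HD (1080p)":  (1920, 1080),
    "UHD 4K (2160p)":   (3840, 2160),
}

for name, (w, h) in resolutions.items():
    print(f"{name}: {w}x{h} = {w * h:,} pixels, ratio {w / h:.3f} (16/9 = {16 / 9:.3f})")
```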
All of the resolutions we are going to cover in this article are TV standards, not VESA standards, so we will not deal with intermediate resolutions and aspect ratios that may appear on certain TVs and monitors but do not comply with any standard.
Standardization of HD resolutions
The first resolution to be standardized was HD Ready, better known as 720p, but naming it by its vertical resolution is not how it was originally conceived. To understand why, we have to take into account the standard VGA resolution of 640 pixels per line, which has a horizontal frequency of roughly 31 kHz, double the roughly 15.7 kHz of the American television standard, NTSC.
The idea behind the high-definition television standard was that television content could be played back in this format. We are going to leave the European model aside, since the European consumer electronics industry had long been in decline and it was the Japanese and American manufacturers who set the standards.
It was therefore enough for them to quadruple the horizontal frequency of NTSC, which amounts to doubling that of VGA, to reach the horizontal frequency of the first HD standard, 720p or HD Ready. This is what allowed the first HD Ready TVs to carry a combination of typical TV and PC ports.
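Using the usual figures for those standards (NTSC at roughly 15.7 kHz and VGA at roughly 31.5 kHz), the arithmetic this paragraph refers to is easy to check; the snippet below only verifies the doubling relationship between the stated figures, nothing more:

```python
ntsc_h = 15.734e3  # NTSC horizontal scan frequency, in Hz
vga_h  = 31.469e3  # standard VGA (640x480 @ 60 Hz) horizontal frequency, in Hz

print(vga_h / ntsc_h)          # ~2.0 -> VGA doubles the NTSC line rate
print(4 * ntsc_h, 2 * vga_h)   # quadrupling NTSC and doubling VGA land on the same figure
```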
From four thirds to sixteen ninths
The other decision was the choice of the 16:9 widescreen format, and that had a practical motivation. Why was the 4:3 aspect ratio not kept? Let's not forget that when Thomas Edison invented film for storing photographs and motion pictures, he came to the conclusion that the best aspect ratio was precisely 4:3, and that standard continued to be used for over a century.
The origin of the widescreen format comes from the world of cinema. When television entered homes in the 1950s it was in competition with cinema, so the solution of the vast majority of studios was to use a different screen format, a panoramic one. At that time, however, several different formats were in use, such as Panavision with an aspect ratio of 2.20:1 or the famous CinemaScope with a ratio of 2.39:1.
It was when TV panel technology started to be ready for higher resolutions that the industry decided to sit down and agree on the new standard. One of its promoters, Dr. Kerns Powers, took cut-out rectangles of the most important aspect ratios and laid them on top of each other with a common center.
The observation was that the smallest rectangle containing all of the others had the same aspect ratio as the largest rectangle contained inside all of them. And what was it? 1.78:1, in other words 16:9. From there, 16:9 was chosen as the standard aspect ratio, and 720p as the resolution for high definition.
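The geometry behind that observation is easy to reproduce. The sketch below is not Powers' actual procedure, just an illustration: it takes equal-area rectangles with some of the aspect ratios in use at the time (the exact list is an assumption), centers them, and compares the covering and the inscribed rectangles:

```python
from math import sqrt

# Equal-area rectangles with a common centre, one per aspect ratio.
ratios = [4 / 3, 1.66, 1.85, 2.20, 2.39]   # from TV's 4:3 up to CinemaScope

widths  = [sqrt(r) for r in ratios]        # area normalised to 1: w = sqrt(r)
heights = [1 / sqrt(r) for r in ratios]    # h = 1 / sqrt(r)

outer = max(widths) / max(heights)  # smallest rectangle that covers them all
inner = min(widths) / min(heights)  # largest rectangle that fits inside them all

print(outer, inner)                        # both come out at ~1.78
print(sqrt(min(ratios) * max(ratios)))     # the geometric mean of the two extremes
print(16 / 9)                              # ~1.778
```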
The evolution to Full HD
The next standard to arrive was Full HD, better known as 1080p, but as we said before, the initial resolutions were not defined with the vertical resolution as the benchmark but with the horizontal one. In Full HD this is 1920 pixels, which is 50% higher than the horizontal resolution of HD Ready.
The reason is that the chips in TVs responsible for receiving the video signal could be improved, since they consumed very little power, so it was decided to create a higher-resolution version of the standard. But it was mainly because they realized that with Blu-ray they could fit a full-length movie in Full HD resolution without problems and without compromising anything. Full HD was therefore born almost by accident, when they saw that the machinery could be pushed a little further in every respect.
The other reason is that, at the distances at which people usually watch TV, 1080p made a real difference over 720p in terms of picture quality. In addition, LCD panels, despite needing a higher pixel density, had become the same as those used in PCs, so the switch from 720p to 1080p was made at a very low cost.
The latest in HD resolutions: 4K
The latest standard resolution used in TVs is 4K, or UHD. Its history is different and its origin has little to do with the other two resolutions, since it was created several years later.
A big step forward for LCD screens was the adoption of what Apple commercially called Retina displays with its iPhone 4. What was special about them? The pixel density reached 300 pixels per inch, the same as in professional photography, a limit beyond which, at the distance at which we normally hold a smartphone, increasing the resolution would make no noticeable difference. How did this affect televisions? Well, it made it possible to start manufacturing panels with higher pixel density at scale while reducing costs.
The choice of 2160 pixels of vertical resolution comes from the fact that it is the same vertical resolution used in cinema, only that in movie theaters the horizontal resolution is 4096 pixels while in televisions it is 3840 pixels. All of which means four times the pixels of Full HD, the biggest leap there has ever been: from standard TV to HD Ready it was about 3:1, from HD Ready to Full HD it was 2.25:1, but from Full HD to 4K it is 4:1.
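Taking 640×480 VGA as the standard-definition baseline (an assumption; PAL or NTSC broadcast frames give slightly different figures), the generational leaps work out as follows:

```python
# Pixel-count leap from one generation to the next.
generations = [
    ("SD (640x480)",         640 * 480),
    ("HD Ready (1280x720)",  1280 * 720),
    ("Full HD (1920x1080)",  1920 * 1080),
    ("4K UHD (3840x2160)",   3840 * 2160),
]

for (prev_name, prev_px), (name, px) in zip(generations, generations[1:]):
    print(f"{prev_name} -> {name}: {px / prev_px:.2f}x the pixels")
# SD -> HD Ready: 3.00x, HD Ready -> Full HD: 2.25x, Full HD -> 4K: 4.00x
```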
HD resolutions on PC today
HD resolutions are the norm on PC today, with 1080p being the standard for most PC users. However, having four times as many pixels in 4K as in Full HD brings with it a series of hardware requirements, which are as follows (the sketch after the list puts rough numbers on them):
- The bandwidth needed to transport pixel information from the GPU to the screen is four times higher.
- With four times the pixels, we have four times the compute requirements, not to mention improvements in image quality that require more processing power.
- It also means quadruple the video memory required.
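As a back-of-the-envelope sketch of those requirements (uncompressed figures at 60 Hz; real links such as HDMI or DisplayPort add blanking and overhead, and games need far more VRAM than a single framebuffer):

```python
def link_bandwidth_gbps(w, h, fps=60, bpp=24):
    """Raw pixel bandwidth in Gbit/s (no blanking intervals, no compression)."""
    return w * h * fps * bpp / 1e9

def framebuffer_mib(w, h, bytes_per_pixel=4):
    """Size of a single 32-bit framebuffer in MiB."""
    return w * h * bytes_per_pixel / 2**20

for name, (w, h) in [("Full HD", (1920, 1080)), ("4K UHD", (3840, 2160))]:
    print(f"{name}: ~{link_bandwidth_gbps(w, h):.1f} Gbit/s, "
          f"~{framebuffer_mib(w, h):.1f} MiB per framebuffer")
# 4K comes out at exactly four times Full HD in both figures.
```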
Of the three points, the first is the easiest to meet, but not the other two, because VRAM has not evolved fast enough and GPU performance has stagnated, which has forced manufacturers to develop AI-based super-resolution techniques, where the GPU only needs to render the image at a lower source resolution and then scale it up.
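The snippet below is only a toy nearest-neighbour upscale to illustrate the render-low/scale-up idea; real techniques such as DLSS reconstruct detail with a trained network and temporal data rather than simply repeating pixels:

```python
def upscale_nearest(frame, factor=2):
    """Toy nearest-neighbour upscale of a frame given as a list of rows of pixel values."""
    out = []
    for row in frame:
        wide_row = [px for px in row for _ in range(factor)]  # repeat each pixel horizontally
        out.extend([list(wide_row) for _ in range(factor)])   # repeat each row vertically
    return out

# Render internally with a quarter of the pixels, present at the panel's native size.
low = [[(x + y) % 256 for x in range(4)] for y in range(2)]   # tiny stand-in frame
high = upscale_nearest(low, factor=2)
print(len(low), "x", len(low[0]), "->", len(high), "x", len(high[0]))  # 2 x 4 -> 4 x 8
```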
Obviously, 4K is not a problem for the world of cinema and television: a codec capable of playing back 4K video costs a few dollars to implement, if not a few cents. That is why 4K TV content is proliferating, and if we add the ease of porting cinema material to TV thanks to the shared vertical resolution, that says it all.