What is now AMD’s GPU division was born in 1985 as Array Technology Incorporated, a company that specialized from the start in graphics chips for PC graphics cards. Its first product was the Graphics Solution Rev 3, a card capable of displaying graphics in both CGA and Hercules modes.
Thanks to this versatility it became very popular at the time, since it removed the need to install different graphics cards depending on the type of monitor connected to the PC.
ATI Wonder Series
ATI started to make a name for itself with the EGA Wonder, an EGA-standard graphics card that combined the Graphics Solution Rev 3 chipset with an EGA chip from Chips & Technologies. The card could therefore handle several graphics standards without requiring multiple adapters, which gave it enormous compatibility amid the confusion of competing PC graphics standards at the time.
The card was later replaced by the ATI VGA Wonder, which used a VGA chipset developed by ATI itself while maintaining compatibility with the earlier standards.
ATI Mach Series
In 1990 ATI was still cloning IBM standards: the ATI Mach 8 was a clone of the IBM 8514/A graphics chip, the forerunner of the XGA standard. This standard allowed a screen resolution of 1024 × 768 pixels with a 256-color palette, or 640 × 480 with 16 bits per pixel (65,536 colors).
What made this standard peculiar is that it was born as a parallel extension of VGA aimed at professional monitors. Its biggest advantage? The addition of blitter-style graphics operations like those of the Commodore Amiga, so visual operations such as drawing lines, copying blocks, or filling a shape with color became possible without spending CPU cycles on them, as the sketch below illustrates.
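To give an idea of the work such a blitter takes off the CPU, here is a minimal sketch of a rectangular block copy done entirely in software at 8 bits per pixel. The function, its parameters, and the toy framebuffer are hypothetical; on an 8514/A-class accelerator the equivalent operation runs in hardware after the CPU programs a few registers.

```c
#include <stdint.h>
#include <stdio.h>
#include <string.h>

/* Hypothetical example: a rectangular block copy ("blit") at 8 bits per
 * pixel done entirely in software. With a hardware blitter the CPU only
 * writes source, destination, width and height to the card and the chip
 * moves the pixels on its own. */
#define PITCH 16  /* bytes per scanline of our toy framebuffer */

static void software_blit(uint8_t *fb,
                          int src_x, int src_y,
                          int dst_x, int dst_y,
                          int width, int height)
{
    /* Assumes the source and destination rectangles do not overlap. */
    for (int row = 0; row < height; ++row) {
        uint8_t *src = fb + (src_y + row) * PITCH + src_x;
        uint8_t *dst = fb + (dst_y + row) * PITCH + dst_x;
        for (int col = 0; col < width; ++col)
            dst[col] = src[col];   /* every copied pixel costs CPU time */
    }
}

int main(void)
{
    uint8_t fb[8 * PITCH];
    memset(fb, 0, sizeof fb);
    memset(fb, 0xFF, 4);                 /* a 4-pixel white strip at (0,0) */
    software_blit(fb, 0, 0, 8, 2, 4, 1); /* copy it to (8,2) */
    printf("%u\n", fb[2 * PITCH + 8]);   /* prints 255 */
    return 0;
}
```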
Once the pre-VGA graphics standards were completely obsolete, ATI released the ATI Mach 32, which included a VGA core and thus unified the ATI Wonder and ATI Mach series into one, although ATI kept using the Wonder name for some products outside its main line of graphics cards, such as video decoders.
Starting with the ATI Mach64, the VGA and Mach chipsets were unified into a single chip.
ATI Rage Series
The enormous success of 3Dfx’s Voodoo Graphics and the emergence of games based on real-time 3D graphics pushed companies like ATI to catch up or risk disappearing from the graphics card market.
The problem with most graphics cards at the time was that, once the Intel Pentium became standard in home PCs, the CPU could render 3D game scenes faster than the graphics cards of the day could.
ATI’s solution was to modify its Mach64 chipset with a series of changes intended to make its line of graphics cards competitive. The changes were as follows:
- Texture processing and filtering were added, backed by a 4 KB texture cache.
- A triangle setup/rasterization unit was added.
However, the first ATI Rage arrived late: while it had roughly the power of a Voodoo Graphics, it paled in comparison to the Voodoo 2 and the Riva TNT. ATI’s response was to double the number of texture units from 1 to 2 and move to a 128-bit bus, a design it sold under the name ATI Rage 128.
Another card that stood out within the ATI Rage line was the Rage Fury MAXX, the first ATI graphics card to implement what would eventually become CrossFire technology, allowing ATI to place two Rage 128 chips on a single card.
ATI Radeon 7500
ATI ditched the Rage brand starting with the first Radeon, the company’s first DirectX 7 compatible card. It used the ATI R100 GPU, originally named Rage 7; in reality it was little more than a Rage 128 with a built-in T&L unit, since full DirectX 7 support required units that calculate scene geometry when rendering 3D scenes in real time.
Thus the Radeon 7500 was, at heart, nothing more than a Rage 128 with fixed-function units for geometry calculation. It didn’t perform as well as the first- and second-generation GeForces released at the same time, but it was the turning point that pushed the company to make the necessary change.
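As an illustration of the geometry work such a fixed-function T&L unit offloads, the sketch below transforms a single vertex by a 4 × 4 matrix, the kind of per-vertex math that previously had to run on the CPU for every vertex of every frame. The names and the row-major layout are assumptions for the example, not a description of ATI’s hardware.

```c
#include <stdio.h>

/* Hypothetical example: transform one vertex (x, y, z, w) by a 4x4
 * row-major matrix. Hardware T&L performs this multiply (plus lighting)
 * for every vertex in the scene, so the CPU no longer has to. */
typedef struct { float v[4]; } Vec4;
typedef struct { float m[4][4]; } Mat4;

static Vec4 transform_vertex(const Mat4 *m, Vec4 in)
{
    Vec4 out = {{0.0f, 0.0f, 0.0f, 0.0f}};
    for (int row = 0; row < 4; ++row)
        for (int col = 0; col < 4; ++col)
            out.v[row] += m->m[row][col] * in.v[col];
    return out;
}

int main(void)
{
    /* Identity matrix: the vertex should come out unchanged. */
    Mat4 identity = {{{1,0,0,0},{0,1,0,0},{0,0,1,0},{0,0,0,1}}};
    Vec4 v = {{1.0f, 2.0f, 3.0f, 1.0f}};
    Vec4 r = transform_vertex(&identity, v);
    printf("%f %f %f %f\n", r.v[0], r.v[1], r.v[2], r.v[3]);
    return 0;
}
```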
ATI Radeon 9700
The ATI Radeon 9700, based on the R300 chip, became one of the most important graphics cards in ATI’s history, if not the most important of all, comparable to the first NVIDIA GeForce in terms of impact.
In order to be competitive, ATI bought the startup ArtX, founded by former Silicon Graphics engineers who had worked on the 3D technology of the Nintendo 64 and Nintendo GameCube consoles. The result of the purchase? The R300 GPU, which put ATI at the forefront.
On top of that, they had the perfect storm: the launch of the Radeon 9700 coincided with NVIDIA’s biggest stumble in its history, the GeForce FX. The ATI Radeon legend had begun, and the rivalry between NVIDIA and ATI dates from this point in history.
ATI Radeon HD 2000 Series
ATI’s first PC graphics card with unified shaders appeared later than the GeForce 8800, although ATI had been the first in history to implement them, in the Xbox 360 GPU.
However, the architectures were completely different, and the Microsoft console GPU cannot be compared with the R600. In its most powerful version, the Radeon HD 2900, the R600 had 320 ALUs, which under the Terascale architecture corresponds to 64 VLIW5 units and therefore 4 SIMD calculation units, as the breakdown below shows.
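As a quick sanity check on those figures (the grouping into 4 SIMD arrays of 16 VLIW5 units each is the usual description of the R600, taken here as an assumption):

$$4 \text{ SIMD arrays} \times 16 \text{ VLIW5 units} \times 5 \text{ ALUs per unit} = 320 \text{ ALUs}$$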
However, the ATI R600 GPUs on which the Radeon HD 2000 series was based were a disappointment, as they were unable to compete with the NVIDIA GeForce 8800 GTX. A year later ATI released the HD 3000 series on the 55 nm node, eliminating some design flaws of the R600 architecture, but it wasn’t a major leap.
ATI Radeon HD 4870
Following the fiasco of the R600 variants, AMD decided to upgrade the architecture from 320 to 800 ALUs, creating the R700 architecture that debuted with the ATI Radeon HD 4000 series, in which the HD 4870 became the absolute queen. Once again, ATI had regained the performance throne.
ATI Radeon HD 5000 Series
Instead of designing a completely new architecture for DirectX 11, ATI decided to release the R800 chipset, which was essentially an R700 optimized and improved for DirectX 11. Strictly speaking it wasn’t designed from scratch for DirectX 11; ATI released it as Terascale 2 to get by.
ATI Radeon HD 7000 Series
The HD 7000 series was the first to use the Graphics Core Next (GCN) architecture, on which AMD GPUs were based for multiple generations of graphics cards.
The GCN architecture was a total change in the organization of the compute units compared to the Terascale architectures, since they went from 16 VLIW4 units to 4 SIMD16 units per compute unit, as the comparison below shows.
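As a rough back-of-the-envelope comparison, assuming 64 ALUs or lanes per compute block in both designs, the change is mainly in how those 64 are grouped and scheduled:

$$\text{Terascale (VLIW4): } 16 \times 4 = 64 \text{ ALUs} \qquad \text{GCN: } 4 \times 16 = 64 \text{ SIMD lanes}$$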
It is the longest-lived architecture in AMD’s history, spanning from the Radeon HD 7000 series, which reached the market in 2012, to the Radeon VII released in 2019. It has since been replaced by the new RDNA architecture, currently in its second generation (RDNA 2), with a third on the way.