Until now, Amazon's servers have handled the processing behind, among other things, the Alexa assistant on hardware based on NVIDIA GPUs. Amazon has now migrated roughly 80% of that Artificial Intelligence infrastructure to Elastic Compute Cloud (EC2) Inf1 instances, which are built around its own new chips.
Compared with the previous G4 instances, which run on conventional NVIDIA GPUs, the Inf1 instances deliver around 30% more throughput while cutting costs by 45%, and according to Amazon this change gives it the best AI infrastructure on the market for natural language understanding and speech processing.
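To see how those two figures combine, here is a quick back-of-the-envelope calculation. The absolute prices and volumes are invented purely for illustration, and it assumes the 45% refers to instance cost rather than cost per inference; only the percentages come from the article.

```python
# Back-of-the-envelope comparison of G4 vs. Inf1 using the figures quoted
# above (~30% more throughput, ~45% lower cost). Absolute numbers are
# made up; only the two percentages come from the article.

g4_price_per_hour = 1.00                  # arbitrary baseline units
g4_inferences_per_hour = 1_000_000        # arbitrary baseline volume

inf1_price_per_hour = g4_price_per_hour * (1 - 0.45)       # ~45% cheaper
inf1_inferences_per_hour = g4_inferences_per_hour * 1.30   # ~30% more throughput

g4_cost_per_million = g4_price_per_hour / (g4_inferences_per_hour / 1e6)
inf1_cost_per_million = inf1_price_per_hour / (inf1_inferences_per_hour / 1e6)

saving = 1 - inf1_cost_per_million / g4_cost_per_million
print(f"G4:   {g4_cost_per_million:.2f} per million inferences")
print(f"Inf1: {inf1_cost_per_million:.2f} per million inferences")
print(f"Cost per inference drops by about {saving:.0%}")   # roughly 58%
```

Under that reading, the cheaper instance and the higher throughput compound into an even larger drop in cost per inference.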
NVIDIA and Amazon cut ties: here’s how Alexa works
Amazon Alexa works like this: the "speaker" product itself does essentially no processing. It simply has a microphone and a speaker that capture the user's requests and forward them to Amazon's AWS servers, which send the response back.
This means that Amazon Echo devices are essentially "dumb" devices, or thin clients in computer jargon, because their only function is to act as intermediaries between the user and Amazon's cloud servers. The big drawback is that when the device goes offline it becomes practically useless.
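The thin-client pattern described above can be sketched in a few lines of Python. The endpoint URL and the record_audio/play_audio helpers are hypothetical stand-ins, not Alexa's real API; the point is only that the device ships audio to the cloud and plays back whatever comes back.

```python
# Minimal sketch of the "thin client" pattern: no local understanding,
# just capture, send, and play back. Endpoint and helpers are placeholders.
import requests

CLOUD_ENDPOINT = "https://example.com/voice-assistant"  # placeholder, not a real Alexa URL

def handle_utterance(record_audio, play_audio):
    audio = record_audio()                       # raw microphone capture
    try:
        resp = requests.post(CLOUD_ENDPOINT, data=audio, timeout=5)
        resp.raise_for_status()
    except requests.RequestException:
        # Offline or server error: the device has no local fallback,
        # which is exactly the limitation the article points out.
        return None
    play_audio(resp.content)                     # speech synthesized in the cloud
    return resp.content
```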
Each AWS Inferentia chip is made up of four NeuronCores, each implementing what Amazon calls a "high-performance systolic array matrix multiply engine". Roughly speaking, this means every NeuronCore contains a grid of data processing units (DPUs) that push data through in a pipelined, linear fashion. In addition, each Inferentia chip carries a large cache memory that reduces latency and so provides faster responses to the user.
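To give an idea of what a systolic array matrix multiply engine does, here is a toy software simulation of the data flow: a grid of cells, each accumulating one output element while the operands stream past it in a skewed schedule. This is a conceptual sketch of the general technique, not Inferentia's actual design.

```python
import numpy as np

def systolic_matmul(A, B):
    """Toy output-stationary systolic array: cell (i, j) accumulates C[i, j]
    while A streams in from the left and B streams in from the top."""
    M, K = A.shape
    K2, N = B.shape
    assert K == K2
    C = np.zeros((M, N))
    # At time step t, cell (i, j) sees A[i, k] and B[k, j] with k = t - i - j,
    # mimicking the skewed, wave-like data flow of a hardware systolic array.
    for t in range(M + N + K - 2):
        for i in range(M):
            for j in range(N):
                k = t - i - j
                if 0 <= k < K:
                    C[i, j] += A[i, k] * B[k, j]
    return C

A = np.random.rand(4, 3)
B = np.random.rand(3, 5)
assert np.allclose(systolic_matmul(A, B), A @ B)
```

In hardware, each of those inner multiply-accumulate steps happens in a dedicated cell every clock cycle, which is why this layout is so efficient for the matrix multiplications that dominate neural-network inference.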
A breakup that means lost revenue for NVIDIA
Amazon's decision does not appear to stem from any dissatisfaction with NVIDIA, whether in performance or in supply. The company has simply chosen to use its own chips (manufactured by a third party, in any case) for this workload instead of GPUs which, as we know, NVIDIA has been steering ever more toward AI performance rather than pure, straightforward graphics.
Amazon estimates the savings at 45%. That figure covers both what it no longer pays NVIDIA for hardware and the lower running costs, since the new systems are more energy efficient for this workload. In short, the same work gets done at a lower cost, and who can say no to that?
In any case, this "breakup" will not sit well with NVIDIA, even though it has just reported strong quarterly revenue. We will have to wait for the green team's financial reports over the next six months to see the real impact on the company's accounts, but it looks like it is going to hurt.