It already owns the world’s largest AI farm


For the past few months, a kind of AI fever has gripped the world's largest companies. NVIDIA started before everyone else, which helped it become the most valuable company on the planet, but now everyone is scrambling to board the train in time to claim their piece of the pie, and billionaire Elon Musk was never going to be left behind: what he did was pour huge amounts of money into NVIDIA hardware to build the world's largest AI farm.

World’s Largest AI Farm Owned by Elon Musk

Elon Musk's xAI has officially started training its AI model using the most powerful graphics card NVIDIA sells, the H100, and not exactly a few of them: as we reported a few weeks ago, there are 100,000 units, no more and no less. Elon Musk proudly announced it on X, calling it "the most powerful AI training cluster in the world." In the same post, he said his supercluster will train with 100,000 liquid-cooled NVIDIA H100 cards on a single RDMA fabric, and he congratulated both his xAI team and NVIDIA for starting the training in Memphis, USA.

Following that first announcement, Elon Musk said the world's most powerful AI will be ready by December of this year, meaning it could be done after only about five months of training. Grok 2 (Grok being the name of xAI's AI) is due to be released next month, but it is Grok 3, the next iteration, that will be trained on this H100 supercluster and will arrive in December.

And note that this announcement comes just two weeks after xAI and Oracle walked away from their multi-billion-dollar server deal. xAI had been leasing NVIDIA AI chips from Oracle, but decided to build its own mega cluster instead, ending the deal with Oracle; the system xAI is building is, logically, much more powerful.

Data put into perspective

Saying that Elon Musk built (well, not him personally, rather his money) a mega server cluster with 100,000 NVIDIA graphics cards already sounds outrageous, but it sounds even more so once you put the numbers into perspective.

[Image: NVIDIA H100 AI graphics card]

Each NVIDIA H100 graphics card costs about $30,000. Grok 2 used 20,000 units, which means the investment in graphics cards alone came to about $600 million, and that is without counting the rest of the infrastructure. Grok 3 uses 100,000 units, so we are talking about roughly $3 billion spent on graphics cards alone... and again, once you add the rest of the infrastructure, the total easily reaches $5 billion.
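As a sanity check, the arithmetic above can be reproduced in a few lines of Python. Note that the ~$30,000 unit price is the article's ballpark figure, not an official NVIDIA price:

```python
# Back-of-the-envelope GPU cost estimate using the article's figures.
H100_UNIT_PRICE = 30_000  # USD, approximate price per NVIDIA H100

grok2_gpus = 20_000   # GPUs reportedly used for Grok 2
grok3_gpus = 100_000  # GPUs in the Memphis supercluster for Grok 3

grok2_gpu_cost = grok2_gpus * H100_UNIT_PRICE  # graphics cards only
grok3_gpu_cost = grok3_gpus * H100_UNIT_PRICE  # graphics cards only

print(f"Grok 2 GPU spend: ${grok2_gpu_cost:,}")  # $600,000,000
print(f"Grok 3 GPU spend: ${grok3_gpu_cost:,}")  # $3,000,000,000
```

The second figure makes the scale clear: $3 billion on silicon alone, before power, cooling, networking, or the building itself.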

By the way, another fact is worth mentioning here: at the beginning we said that Elon Musk was a bit late to the AI train, and that claim is reinforced by the fact that NVIDIA is about to launch its next generation of AI graphics cards, the H200, which should arrive in the third quarter of this year. These cards theoretically offer up to 45% more performance in generative AI, so imagine what Musk could have achieved if he had waited for the H200 to be available...
