Elon Musk has confirmed that his companies Tesla and Twitter have been buying large numbers of GPUs when asked whether he was building up Twitter's compute capabilities to develop a generative artificial intelligence project. Meanwhile, the Financial Times reports that Musk's AI venture will be a separate entity from his other companies, though it may use Twitter content for training.
Elon Musk's AI project, which he began exploring earlier this year, is reportedly separate from his other companies, but it could potentially use Twitter content as data to train its language model and tap into Tesla's computing resources, according to the Financial Times. This somewhat contradicts the earlier report, which claimed that the AI project would be part of Twitter.
To build up the new venture, Musk is recruiting engineers from top AI companies, including DeepMind, and has already brought on Igor Babuschkin from DeepMind along with roughly half a dozen other AI specialists.
Musk is also reportedly negotiating with various SpaceX and Tesla investors about the possibility of funding his latest AI endeavor, according to a person with firsthand knowledge of the talks, which would confirm that the project is not set to be part of Twitter.
In a recent Twitter Spaces interview, Musk was asked about a report claiming that Twitter had procured roughly 10,000 Nvidia compute GPUs. Musk acknowledged this, stating that everyone, including Tesla and Twitter, is buying GPUs for compute and AI these days. That is true: both Microsoft and Oracle have acquired tens of thousands of Nvidia's A100 and H100 GPUs in recent quarters for their AI and cloud services.
“It seems like everyone and their dog is buying GPUs at this point,” Musk said. “Twitter and Tesla are certainly buying GPUs.”
Nvidia's latest H100 GPUs for AI and high-performance computing (HPC) are rather expensive. CDW sells Nvidia's H100 PCIe card with 80GB of HBM2e memory for as much as $30,603 per unit. On eBay, these cards sell for over $40,000 each if one wants the product quickly.
Recently, Nvidia launched its even more powerful H100 NVL product, which bridges two H100 PCIe cards carrying 188GB of HBM3 memory in total (94GB per card) into an ultimate dual-GPU solution designed specifically for training large language models. This product will certainly cost well above $30,000 per unit, though it is unclear at what price Nvidia sells such units to customers buying tens of thousands of boards for their LLM projects.
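For a sense of scale, the sketch below is a rough back-of-envelope estimate of what the reported 10,000-GPU order would cost at CDW's retail price for the H100 PCIe card; it is purely illustrative, since the actual price Nvidia charges bulk buyers is not public and is likely much lower.

```python
# Illustrative estimate only: reported order size multiplied by the CDW
# retail price for a single H100 PCIe 80GB card. Real bulk pricing for
# large buyers is not disclosed and would differ.

unit_price_usd = 30_603   # CDW retail price per H100 PCIe card
reported_order = 10_000   # GPUs Twitter reportedly procured

total_usd = unit_price_usd * reported_order
print(f"Estimated outlay at retail pricing: ${total_usd:,}")
# Estimated outlay at retail pricing: $306,030,000
```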
Meanwhile, the exact place of the AI team in Musk's corporate empire remains unclear. The renowned entrepreneur established a company called X.AI on March 9, the Financial Times reported, citing business records from Nevada. Meanwhile, he recently changed Twitter's name in the company's records to X Corp., which may be part of his plan to build an 'everything app' under the 'X' brand. Musk is currently the sole director of X.AI, while Jared Birchall, who manages Musk's wealth, is listed as its secretary.
The rapid rise of ChatGPT from OpenAI, a company Elon Musk co-founded in 2015 but is no longer involved with, reportedly inspired him to explore the idea of a rival firm. Meanwhile, this new AI venture is expected to be a separate entity from his other companies, presumably to ensure that the new project will not be constrained by Tesla's or Twitter's corporate structures.