Zuckerberg didn't say how many of the graphics processing units (GPUs) the company has already purchased, but the H100 didn't hit the market until late 2022, and supply was limited. Analysts at Raymond James estimate that Nvidia is selling the H100 for $25,000 to $30,000, though units have gone for more than $40,000 on eBay. If Meta were to pay at the low end of that range, the outlay would be nearly $9 billion.
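As a rough sanity check on that "nearly $9 billion" figure, the sketch below multiplies the Raymond James price estimates by the roughly 350,000 H100s Zuckerberg has said Meta will own by the end of 2024; the unit count and any volume discounts Meta may have negotiated are assumptions, not figures confirmed in this report.

```python
# Back-of-the-envelope estimate of Meta's H100 outlay.
# Assumptions: ~350,000 H100s (Zuckerberg's stated end-of-2024 target) and
# Raymond James' $25,000-$30,000 per-unit price estimate; the price Meta
# actually pays at its volume is not publicly known.
h100_units = 350_000
price_low = 25_000   # USD, low end of Raymond James estimate
price_high = 30_000  # USD, high end of Raymond James estimate

low_total = h100_units * price_low    # ~$8.75 billion -> "nearly $9 billion"
high_total = h100_units * price_high  # ~$10.5 billion

print(f"Low-end outlay:  ${low_total / 1e9:.2f} billion")
print(f"High-end outlay: ${high_total / 1e9:.2f} billion")
```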
Additionally, Zuckerberg said that Meta's computing infrastructure will contain “nearly 600,000 H100 equivalents of computing power when other GPUs are included.” In December, tech companies including Meta, OpenAI and Microsoft announced they would be using AMD's new Instinct MI300X AI computer chips.
Meta needs these high-performance computer chips to advance research in artificial general intelligence (AGI), which Zuckerberg says is a “long-term vision” for the company. OpenAI and Google's DeepMind unit are also exploring AGI, a futuristic form of AI comparable to human-level intelligence.
Meta's chief scientist Yann LeCun emphasized the importance of GPUs at a media event in San Francisco last month.
“[If] you think AGI is in, the more GPUs you have to buy,” LeCun said at the time. Referring to Nvidia CEO Jensen Huang, LeCun said: “There is an AI war, and he's providing the weapons.”
In Meta's third-quarter earnings report, the company said total expenses for 2024 will be between $94 billion and $99 billion, driven in part by its expanding computing infrastructure.
“In terms of investment priorities, AI will be our largest area of investment in 2024, both in technology and computing resources,” Zuckerberg said in the conference call with analysts.
Zuckerberg said Thursday that Meta plans to “responsibly open source” its yet-to-be-developed “general intelligence,” an approach the company is also taking with its Llama family of large language models.
Meta is currently training Llama 3 and is also ensuring that its Fundamental AI Research (FAIR) team and its GenAI research team work more closely together, Zuckerberg said.
Shortly after Zuckerberg's post, LeCun said in a post on X: “To accelerate progress, FAIR is now a sister organization to GenAI, the AI product division.”
—CNBC's Kif Leswing contributed to this report