Nvidia results will give Wall Street a glimpse of AI demand in 2024

Jensen Huang, president of Nvidia, holds the Grace Hopper superchip CPU for generative AI at Supermicro’s keynote presentation during Computex 2023.

Walid Berrazeg | LightRocket | Getty Images

When Nvidia reports third-quarter results on Tuesday, analysts expect revenue growth of over 170%.

As if that wasn’t astonishing enough, the company’s fourth-quarter guidance is expected to show an even bigger number, according to LSEG estimates: nearly 200% growth.

Ahead of Thanksgiving, Wall Street will be closely scrutinizing the company at the center of this year’s artificial intelligence boom.

Nvidia’s stock price has risen 237% in 2023, far outperforming every other member of the S&P 500. Its market capitalization is now $1.2 trillion, significantly higher than that of Meta or Tesla. Any hint in the earnings release that enthusiasm for generative AI is waning, that some big customers are switching to AMD’s processors, or that China restrictions are hurting the business could mean trouble for a stock that has been on such a tear.

“Expectations are high heading into NVDA’s third-quarter 2024 earnings release on November 21,” Bank of America analysts wrote in a report last week. They have a Buy rating on the stock and said they “expect a beat/raise.”

However, they cited China restrictions and competition concerns as two issues that will draw investors’ attention. In particular, AMD’s entry into the generative AI market represents a new dynamic for Nvidia, which has largely had the AI graphics processing unit (GPU) market to itself.

AMD CEO Lisa Su said late last month that the company expects GPU sales to be around $400 million in the fourth quarter and more than $2 billion in 2024. The company announced in June that the MI300X, its most advanced GPU for AI, would be shipping to some customers this year.

Nvidia is still by far the market leader in GPUs for AI, but high prices are a problem.

“NVDA must forcefully counter the narrative that its products are too expensive for generative AI inference,” Bank of America analysts wrote.

Last week, Nvidia introduced the H200, a GPU designed for training and deploying the kinds of AI models that are powering the generative AI boom, enabling companies to develop smarter chatbots and convert simple text into creative graphic designs.

The new GPU is an upgrade from the H100, the chip OpenAI uses to train its most advanced large language model, GPT-4 Turbo. According to an estimate by Raymond James, H100 chips cost between $25,000 and $40,000, and thousands of them must work together to create the largest models in a process called “training.”

The H100 chips are part of Nvidia’s data center group, whose revenue rose 171% to $10.32 billion in the second quarter. That accounted for about three quarters of Nvidia’s total revenue.

Analysts expect data center revenue to nearly quadruple to $13.02 billion in the third quarter, from $3.83 billion a year earlier, according to FactSet. Total revenue is expected to rise 172% to $16.2 billion, according to analysts surveyed by LSEG, formerly Refinitiv.

Growth is expected to peak in the fourth quarter at around 195%, LSEG estimates show. Expansion should remain robust throughout 2024 but is expected to slow each quarter of the year.

Executives can expect to face questions on the earnings call about the shakeup at OpenAI, the creator of the chatbot ChatGPT, which has been a key catalyst for Nvidia’s growth this year. On Friday, OpenAI’s board announced the sudden firing of CEO Sam Altman amid disputes over the speed of the company’s product development and the focus of its efforts.

OpenAI is a big buyer of Nvidia’s GPUs, as is Microsoft, OpenAI’s biggest backer. After a chaotic weekend, OpenAI said Sunday night that former Twitch CEO Emmett Shear would lead the company on an interim basis, and shortly afterward Microsoft CEO Satya Nadella said Altman and ousted OpenAI chairman Greg Brockman would be joining Microsoft to lead a new advanced AI research group.

Nvidia investors have so far dismissed China-related concerns despite their potential importance to the company’s business. Last year, the H100 and A100 AI chips were the first to be hit by new U.S. restrictions aimed at curbing sales to China. Nvidia said in September 2022 that the U.S. government would still allow it to develop the H100 in China, which accounts for 20% to 25% of its data center business.

The company has reportedly found a way to continue selling to the world’s second-largest economy while complying with U.S. regulations. Nvidia will deliver three new chips based on the H100 to Chinese manufacturers, Chinese financial news outlet Cailian Press reported last week, citing sources.

Nvidia has historically avoided providing annual guidance, preferring to look ahead only to the next quarter. But considering how much money investors have poured into the company this year, and how little else there is to follow this week, they’ll be paying close attention to CEO Jensen Huang’s tone on the conference call for any sign that enthusiasm for generative AI may be waning.

WATCH: EMJ’s Eric Jackson expects a good report from Nvidia