Nvidia’s latest AI chip will cost more than $30,000, CEO says – CNBC

  • Nvidia’s next-generation graphics processor for artificial intelligence, called Blackwell, will cost between $30,000 and $40,000 per unit, CEO Jensen Huang told CNBC’s Jim Cramer.
  • The price suggests that the chip, which is likely to be in hot demand for training and deploying AI software such as ChatGPT, will be priced in a range similar to that of its predecessor, the H100.
Nvidia’s next-generation graphics processor for artificial intelligence, called Blackwell, will cost between $30,000 and $40,000 per unit, CEO Jensen Huang told CNBC’s Jim Cramer on Tuesday on “Squawk on the Street.”

“We had to invent some new technology to make it possible,” Huang said, holding up a Blackwell chip. He estimated that Nvidia spent about $10 billion in research and development costs.

The price suggests that the chip, which is likely to be in hot demand for training and deploying AI software such as ChatGPT, will sit in a range similar to that of its predecessor, the H100, based on the Hopper architecture, which analysts estimated cost between $25,000 and $40,000 per chip. The Hopper generation, introduced in 2022, represented a significant price increase over Nvidia’s prior generation of AI chips.

Later, Huang told CNBC’s Kristina Partsinevelos that the cost covers not just the chip itself but also the work of designing data centers and integrating the chips into other companies’ data centers.

Nvidia announces a new generation of AI chips about every two years. Each new generation, such as Blackwell, is generally faster and more energy efficient than the last, and Nvidia uses the publicity around a launch to rake in orders for new GPUs. Blackwell combines two chips and is physically larger than the previous generation.

Nvidia’s AI chips have driven a tripling of the company’s quarterly sales since the AI boom kicked off in late 2022 with the debut of OpenAI’s ChatGPT. Most of the top AI companies and developers have been using the H100 to train their AI models over the past year. Meta, for example, said earlier this year that it is buying hundreds of thousands of Nvidia H100 GPUs.

Nvidia does not reveal list prices for its chips. They come in several different configurations, and the price an end customer such as Meta or Microsoft pays depends on factors such as the volume of chips purchased and whether the customer buys directly from Nvidia as part of a complete system or through a vendor such as Dell, HP or Supermicro that builds AI servers. Some servers are built with as many as eight AI GPUs.

On Monday, Nvidia announced at least three different versions of the Blackwell AI accelerator — a B100, a B200, and a GB200 that pairs two Blackwell GPUs with an Arm-based CPU. They have slightly different memory configurations and are expected to ship later this year.
