Published Fri, Apr 14 2023 12:46 PM EDT
- Nvidia’s H100 graphics cards are selling for more than $40,000 on eBay.
- The high-end chips are still essential for training and deploying AI software.
- The prices were noted by 3D gaming pioneer and former Meta consulting technology chief John Carmack on Twitter.
Jen-Hsun Huang, president and chief executive officer of Nvidia Corp., announces the EGX Edge Supercomputing Platform during the company’s event at Mobile World Congress Americas in Los Angeles, California, Oct. 21, 2019.
Patrick T. Fallon | Bloomberg | Getty Images
The prices for Nvidia’s H100 processors were noted by 3D gaming pioneer and former Meta consulting technology chief John Carmack on Twitter. On Friday, at least eight H100s were listed on eBay at prices ranging from $39,995 to just under $46,000. Retailers have previously offered the chip for around $36,000.
The H100, announced last year, is Nvidia’s latest flagship AI chip, succeeding the A100, a roughly $10,000 chip that’s been called the “workhorse” for AI applications.
Developers are using the H100 to build so-called large language models (LLMs), which are at the heart of AI applications like OpenAI’s ChatGPT. Training those systems is expensive and requires powerful computers to churn through terabytes of data for days or weeks at a time. Deploying them also demands hefty computing power so the AI model can generate text, images or predictions.
Training AI models, especially large ones like GPT, requires hundreds of high-end Nvidia GPUs working together.
Nvidia also offers the DGX, a supercomputer that combines eight GPUs working together. Earlier this year, the company announced new services that would allow companies to rent access to DGX computers for $37,000 per month. At that price, the system uses Nvidia’s older A100 processors.
Nvidia says the H100 is the first chip to be optimized for the specific AI architecture underpinning many of the recent advances in AI, called transformers. Industry experts say the more powerful chips will be necessary to build even bigger and more data-hungry models than those that are currently available.