Published in AI

AI will be Nvidia's money spinner

on 02 March 2023


Will require 30,000 Nvidia graphics cards

TrendForce estimates that OpenAI's ChatGPT will eventually need over 30,000 Nvidia graphics cards.

The cards will be Nvidia's compute accelerators, such as the A100, rather than consumer GPUs, so gamers should not see any supply problems.

Using the A100 (Ampere) accelerator as its baseline, TrendForce worked out that ChatGPT required around 20,000 units to process its training data. That number will increase significantly, potentially to over 30,000 units, as OpenAI continues to deploy ChatGPT and its Generative Pre-Trained Transformer (GPT) models commercially.

The A100 costs between $10,000 and $15,000, so 30,000 units would mean at least $300 million in revenue for Nvidia. The actual figure may be slightly lower, since Nvidia will likely give OpenAI a discount. OpenAI will probably purchase the A100s individually and stack them into clusters.
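As a back-of-the-envelope check on that figure, here is a minimal sketch multiplying TrendForce's estimated unit count by the reported A100 price range (these are the article's estimates, not confirmed Nvidia numbers):

```python
# Revenue estimate from the article's figures (TrendForce estimates,
# not confirmed by Nvidia or OpenAI).
units = 30_000                            # estimated A100 cards needed
price_low, price_high = 10_000, 15_000    # reported A100 price range (USD)

revenue_low = units * price_low           # 30,000 x $10,000
revenue_high = units * price_high         # 30,000 x $15,000

print(f"${revenue_low:,} to ${revenue_high:,}")
# → $300,000,000 to $450,000,000
```

The $300 million quoted in the article is the low end of this range, before any volume discount.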

Nvidia has already started shipping the H100 (Hopper). Hopper delivers up to three times the performance of its predecessor, scales better, and offers up to nine times the throughput in AI training. However, it is pricier, with a price tag of around $32,000.

Intel and AMD also offer rival AI accelerators, so Nvidia will not have the market to itself, but it does look set to make a fair amount from the new gold rush.


Last modified on 02 March 2023