Nvidia has built an AI system called ChipNeMo that aims to speed up the design of its GPUs.
Nvidia's vice president of applied deep learning research, Bryan Catanzaro, told The Wall Street Journal that designing GPUs is demanding work. A chip typically takes about 1,000 people to build, and each of them needs to understand how the different parts of the design process fit together.
That's where ChipNeMo comes in. The AI system runs on a large language model — built on top of Meta's Llama 2 — that the company says it trained on its own data. ChipNeMo's chatbot feature can then answer questions about chip design, such as queries about GPU architecture and the generation of chip-design code, Catanzaro said.
So far, the results are promising. Since ChipNeMo was unveiled last October, Nvidia has found the AI system useful for training junior engineers to design chips and for summarizing notes across 100 different teams, according to the Journal.
Nvidia's efforts to speed up GPU development come as companies scramble to get their hands on its highly sought-after chips to gain an edge in the AI wars. Meta, which has rolled out AI products such as its large language model Llama 2 and its AI-powered Ray-Ban Smart Glasses in the past year, is on track to acquire a total of 600,000 GPUs, including Nvidia's H100s and other AI chips, by the end of 2024.
The race to build the best AI products has been a boon for Nvidia. The chip giant's stock rose about 4 percent on Monday to a record high, and analysts at Goldman Sachs expect the gains to continue through the first half of 2025.
Nvidia isn't the only company using AI to accelerate the chip design process.
Last July, Google DeepMind developed an AI system that the company said could speed up the design of the latest version of its custom chips, per the WSJ.
A few months later, Synopsys, a chip design software giant, released an AI tool built to help chip engineers work faster. Universities such as New York University are also researching how generative AI can be used to design chips more quickly.