The Journal says that as chip makers have reached the limits of atomic-scale circuitry and the physics of electrons, Moore's Law has slowed, and some say it's over. But a different law, potentially no less consequential for computing's next half century, has arisen.
Huang's Law describes how the silicon chips that power artificial intelligence more than double in performance every two years. While the increase can be attributed to both hardware and software, the law's steady progress makes it a unique enabler of everything from autonomous cars, trucks and ships to the face, voice and object recognition in our personal gadgets.
Nvidia's Bill Dally, chief scientist and senior vice president of research, said that between November 2012 and this May, the performance of Nvidia's chips increased 317-fold for an important class of AI calculations. On average, in other words, the performance of these chips more than doubled every year, a rate of progress that makes Moore's Law pale in comparison.
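The arithmetic behind that claim can be checked directly. Assuming "this May" means May 2020 (the span from November 2012 is then roughly 7.5 years), the implied doubling rate works out like this:

```python
import math

# Reported gain: 317x over roughly 7.5 years (Nov 2012 to May 2020
# is an assumption about the dates the article refers to).
gain = 317.0
years = 7.5

doublings = math.log2(gain)   # how many 2x improvements fit in a 317x gain
rate = doublings / years      # doublings per year

print(f"{doublings:.1f} doublings over {years} years")  # ~8.3 doublings
print(f"~{rate:.2f} doublings per year")                # just over one per year
```

A little over one doubling per year, which is why the piece says the chips "more than doubled every year" and why the pace outruns the classic two-year Moore's Law cadence.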
Nvidia's specialty has long been graphics processing units, or GPUs, which operate efficiently when there are many independent tasks to be done simultaneously. Central processing units, or CPUs, like the kind Intel specializes in, are much less efficient at that but better at executing a single, serial task very quickly. Not every computing process can be chopped up for a GPU to handle efficiently, but for the ones that can -- including many AI applications -- a GPU can perform them many times faster while expending the same power.
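The distinction is between serial and data-parallel work. A minimal Python sketch (threads standing in for GPU lanes; the `scale` function and the data are purely illustrative) shows the shape of a workload that chops up cleanly:

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical workload: one independent operation applied to many
# inputs -- the data-parallel shape of work that suits a GPU.
def scale(x):
    return 2 * x

data = list(range(8))

# CPU-style: a single fast core processes the items one after another.
serial = [scale(x) for x in data]

# GPU-style (sketched with a thread pool): the same items handled as
# many independent tasks at once; no task depends on another's result.
with ThreadPoolExecutor(max_workers=4) as pool:
    parallel = list(pool.map(scale, data))

assert serial == parallel  # the work divides with no coordination needed
```

What makes this workload GPU-friendly is that each item is independent, so the result is identical however the work is split; a chain of steps where each depends on the previous one would not divide this way and would favor a fast serial CPU.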
Chipzilla was a primary promoter of Moore's Law, but it was not the only one: perpetuating it required tens of thousands of engineers and billions of dollars in investment across hundreds of companies around the globe.
Nvidia isn't alone in driving Huang's Law -- and in fact its own type of AI processing might, in some applications, be losing its appeal. That's probably a major reason it moved this month to acquire chip architect ARM Holdings, another company key to the ongoing improvement in AI processing speed, for $40 billion.