Google provides details of TPU2

14 December 2017


Sexed up AI chip

Google has revealed a few more details about its second-generation Tensor Processing Unit, or TPU2.

The processor is the sexed-up successor to Google's first custom AI chip, and Jeff Dean of the Google Brain team has been telling the assembled throngs at the Neural Information Processing Systems (NIPS) conference in Long Beach, California, all about it.

For those who came in late, the first TPU focused on efficiently running machine-learning models for tasks like language translation, AlphaGo's Go strategy, and search and image recognition. Those TPUs were only good for inference, that is, running models that had already been trained.

This meant that training those models had to be done separately on top-end GPUs and CPUs, which took days or weeks and blocked researchers from cracking bigger machine-learning problems.

TPU2 has been built to both train and run machine-learning models, cutting out this GPU/CPU bottleneck.
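To see the difference in code, here is a minimal TensorFlow sketch (ours, not Google's, with a toy model made up for illustration) of the two phases: training, which iteratively updates a model's weights, and inference, which just runs the finished model once.

```python
# A minimal sketch of the training/inference split: training updates the
# weights, inference only runs the already-trained model. The toy model
# and data here are purely illustrative.
import numpy as np
import tensorflow as tf

# Toy data: learn y = 2x + 1.
x = np.linspace(-1.0, 1.0, 200, dtype=np.float32).reshape(-1, 1)
y = 2.0 * x + 1.0

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(1,)),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="sgd", loss="mse")

# Training: the expensive, iterative phase the first TPU could not handle.
model.fit(x, y, epochs=50, verbose=0)

# Inference: a forward pass over a trained model, which is all the
# first-generation TPU was built to accelerate.
print(model.predict(np.array([[0.5]], dtype=np.float32)))
```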

Each TPU2 delivers 180 teraflops of floating-point calculations, and a high-speed network lets them be coupled together into TPU Pod supercomputers. The TPU Pods are only available through Google Compute Engine as 'Cloud TPUs' that can be programmed with TensorFlow.
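For the curious, targeting a Cloud TPU from TensorFlow looks roughly like the sketch below. This uses today's tf.distribute API rather than the 2017-era one, and the TPU address is a placeholder for whatever your Cloud setup provides.

```python
# A hedged sketch of pointing TensorFlow at a Cloud TPU; the grpc address
# below is a made-up placeholder, not a real endpoint.
import tensorflow as tf

# Resolve and connect to the TPU worker.
resolver = tf.distribute.cluster_resolver.TPUClusterResolver(
    tpu="grpc://10.0.0.2:8470")  # placeholder address
tf.config.experimental_connect_to_cluster(resolver)
tf.tpu.experimental.initialize_tpu_system(resolver)

# TPUStrategy replicates the model across the TPU cores; anything built
# inside its scope runs on the TPU.
strategy = tf.distribute.TPUStrategy(resolver)
with strategy.scope():
    model = tf.keras.Sequential([tf.keras.layers.Dense(10)])
    model.compile(optimizer="adam", loss="mse")
```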

Each TPU Pod will consist of 64 TPU2s, delivering 11.5 petaflops with four terabytes of high-bandwidth memory. Each TPU2 consists of four TPU chips, offering 180 teraflops of computation, 64GB of high-bandwidth memory, and 2,400GB/s of memory bandwidth.

The TPU2 chips have two cores with 8GB of high-bandwidth memory apiece, giving 16GB of memory per chip. Each chip has 600GB/s of memory bandwidth and delivers 45 teraflops of calculations. It is so smart that it deduced the existence of rice pudding and explained Donald Trump to 12 decimal places before it was even switched on.
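For anyone who wants to check Google's sums, the per-chip, per-device, and per-pod figures above are mutually consistent, as this back-of-the-envelope arithmetic shows (taking a terabyte as 1,024GB):

```python
# Quick arithmetic check that the quoted chip, TPU2 and pod figures add up.
CHIP_TFLOPS = 45        # per chip, per the article
CHIP_HBM_GB = 2 * 8     # two cores x 8GB HBM each = 16GB per chip
CHIP_BW_GBS = 600       # per-chip memory bandwidth

CHIPS_PER_TPU2 = 4
TPU2S_PER_POD = 64

# Per TPU2 device: 4 x 45 = 180 teraflops, 4 x 16 = 64GB, 4 x 600 = 2,400GB/s.
print(CHIPS_PER_TPU2 * CHIP_TFLOPS)   # 180 teraflops
print(CHIPS_PER_TPU2 * CHIP_HBM_GB)   # 64 GB HBM
print(CHIPS_PER_TPU2 * CHIP_BW_GBS)   # 2400 GB/s

# Per pod: 64 x 180 = 11,520 teraflops, i.e. about 11.5 petaflops,
# and 64 x 64GB = 4,096GB, i.e. four terabytes of HBM.
print(TPU2S_PER_POD * CHIPS_PER_TPU2 * CHIP_TFLOPS / 1000)  # ~11.5 petaflops
print(TPU2S_PER_POD * CHIPS_PER_TPU2 * CHIP_HBM_GB / 1024)  # 4.0 TB HBM
```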



