
Microsoft announces Project Brainwave

08 May 2018


AI modules with specialised chips for Brainiacs 

Software King of the World Microsoft has announced that its system for running AI models with specialised chips is now available in preview on Azure.

Dubbed Project Brainwave, it allows developers to deploy machine learning models onto programmable silicon and achieve performance beyond what they could get from a CPU or GPU. Microsoft claims Project Brainwave makes Azure the fastest cloud for running real-time AI today.

Brainwave uses field programmable gate array (FPGA) chips to achieve its speed and low latency. Because the FPGAs can be reprogrammed, the system can be updated often to accelerate AI chores with the latest algorithms, and it handles AI tasks rapidly enough to be used for real-time jobs where response time is crucial. Eventually, customers will also be able to run AI jobs on Microsoft hardware at their own sites, rather than only by tapping into Microsoft's data centres, which speeds up operations.

Brainwave on Azure will use Intel Stratix 10 FPGAs and will initially support ResNet50-based neural networks.
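For readers unfamiliar with the model family, the sketch below shows the kind of ResNet50-based image classifier Brainwave is said to accelerate. It is a minimal illustration using standard Keras APIs, not Microsoft's Azure or Brainwave tooling, and the input file name is hypothetical.

```python
# Illustrative only: a stock ResNet50 image classifier of the kind Brainwave targets.
# Uses standard Keras APIs; the actual Brainwave deployment path on Azure is not shown.
import numpy as np
from tensorflow.keras.applications.resnet50 import ResNet50, preprocess_input, decode_predictions
from tensorflow.keras.preprocessing import image

model = ResNet50(weights="imagenet")  # ResNet50 backbone with ImageNet weights

img = image.load_img("example.jpg", target_size=(224, 224))  # hypothetical input image
x = preprocess_input(np.expand_dims(image.img_to_array(img), axis=0))

preds = model.predict(x)
print(decode_predictions(preds, top=3)[0])  # top-3 (class, label, score) tuples
```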

Microsoft plans to add other AI tools to Project Brainwave.

"We'll be expanding the types of workloads," said Mark Russinovich, chief technology officer of Microsoft's Azure service. Although, curiously, it turns out that image-recognition AI tools can be pretty versatile. "Internally at Microsoft, we use imaging deep neural networks to classify malware," he said.

Microsoft has been talking about Brainwave for serving AI models since last summer, and in March said it was being used to make the AI that powers Bing search results ten times faster.

The news was announced onstage at Build, Microsoft's annual developer conference, held 7-9 May at the Washington State Convention Center in Seattle.

 
