Published in AI

Nvidia announces EGX

27 May 2019

Low latency AI on the Edge

The graphics card maker named after a Roman vengeance Daemon, Nvidia, has announced an accelerated computing platform that enables companies to perform low-latency AI at the edge.

Dubbed EGX, the tech is designed to perceive, understand and act in real time on continuous streaming data from 5G base stations, warehouses, retail stores and factories.

Nvidia said EGX was created to meet the growing demand to perform instantaneous, high-throughput AI at the edge - where data is created - with guaranteed response times, while reducing the amount of data that must be sent to the cloud.

By 2025, 150 billion machine sensors and IoT devices will stream continuous data that will need to be processed - orders of magnitude more than is produced today by individuals using smartphones, Nvidia said. Edge servers like those in the EGX platform will be distributed throughout the world to process data from these sensors in real time.

Nvidia’s vice president and general manager of Enterprise and Edge Computing Bob Pette said that enterprises wanted more powerful computing at the edge to process their oceans of raw data - streaming in from countless interactions with customers and facilities - to make rapid, AI-enhanced decisions that can drive their business.

"A scalable platform like Nvidia EGX allows them to easily deploy systems to meet their needs on premises, in the cloud or both."

EGX starts with the Nvidia Jetson Nano, which can deliver half a trillion operations per second (0.5 TOPS) in a few watts for tasks such as image recognition, the vendor said. It spans all the way up to a full rack of Nvidia T4 servers, delivering more than 10,000 TOPS for real-time speech recognition and other real-time AI tasks.

Nvidia has partnered with Red Hat to integrate and optimise Nvidia Edge Stack with OpenShift, the leading enterprise-grade Kubernetes container orchestration platform.

Nvidia Edge Stack is optimised software that includes Nvidia drivers, a CUDA Kubernetes plugin, a CUDA container runtime, CUDA-X libraries and containerised AI frameworks and applications, including TensorRT, TensorRT Inference Server and DeepStream. Edge Stack is optimised for certified servers and can be downloaded from the Nvidia NGC registry.

Red Hat's chief technology officer Chris Wright said that Red Hat is committed to providing a consistent experience for any workload, footprint and location, from the hybrid cloud to the edge.

"By combining Red Hat OpenShift and NVIDIA EGX-enabled platforms, customers can better optimize their distributed operations with a consistent, high-performance, container-centric environment."

EGX combines the full range of Nvidia AI computing technologies with Red Hat OpenShift and Edge Stack, together with Mellanox and Cisco security, networking and storage technologies. This enables companies in the largest industries - telecom, manufacturing, retail, healthcare and transportation - to quickly stand up state-of-the-art, secure, enterprise-grade AI infrastructure, Nvidia said.
