SK Hynix's GDDR6-AiM (Accelerator in Memory) technology is designed for artificial intelligence and big data processing, bringing basic computational functions onto the memory chips themselves.
The GDDR6-AiM chips can process data in memory at 16 Gbps, which the company says makes certain computations up to 16 times faster than performing them on a conventional DRAM-plus-processor setup.
The company said the chips are aimed at machine learning, high-performance computing, and big data computation and storage. These workloads do not always demand extreme compute performance, but shuttling data between memory and a processor takes time and consumes a great deal of power.
SK Hynix claims the GDDR6-AiM chips run at 1.25V and that performing computations in memory cuts power consumption by 80 per cent compared with moving the data to a CPU or GPU. The chips are designed to be drop-in compatible with existing GDDR6 memory controllers, so it should be possible to use them even on existing graphics cards to boost performance in AI, ML, big data, and HPC workloads.
SK Hynix is not the only company exploring processing-in-memory (PIM) technology. Samsung has been demonstrating HBM2 and GDDR6 memory with embedded processing for about two years, but the approach has yet to attract much industry interest.
SK Hynix also plans to demonstrate its new HBM3 memory devices, which it describes as having 'the world's best specification for high-performance computing.'