Accelerating IO in the Modern Data Center: Magnum IO Storage Partnerships

Originally published at the NVIDIA Developer Blog: Accelerating IO in the Modern Data Center: Magnum IO Storage Partnerships.

As computation shifts from CPUs to faster GPUs for AI, ML, and HPC applications, IO into and out of the GPU can become the primary bottleneck to overall application performance. NVIDIA created Magnum IO GPUDirect Storage (GDS) to streamline data movement between storage and GPU memory and to remove performance bottlenecks in the platform,…

We’re delighted to have so many effective partners working to broaden the adoption of accelerated storage IO to GPUs in modern data centers. Come check out our Birds of a Feather session, Accelerating Storage IO to GPUs, at SC’21 on Tue, Nov 16, 12:15-1:15 CST; see the SC21 presentation listing and the session’s accompanying Google Doc for details.