GTC 2020: Scaling Data by 10⁹x and Compute for Deep-Learning Applications

GTC 2020 S21390
Presenters: John Taylor, DST/CSIRO; Pablo Rozas Larraondo, Australian National University
Abstract
We’ll explore scalable applications of artificial intelligence to massive data sets. First, we’ll cover how we developed and optimized highly parallelized implementations of deep-learning algorithms and tested them on HPC GPU clusters. Then we’ll demonstrate how to build models that run over large high-resolution datasets, identifying the spatial and temporal relationships between physical parameters in global-scale high-resolution numerical weather prediction models.
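The session doesn't publish its code, but the data-parallel pattern the abstract describes is commonly expressed with a framework such as Horovod. Below is a minimal, hypothetical sketch of synchronous data-parallel training with Horovod and PyTorch on a GPU cluster; the model, data, and hyperparameters are placeholders for illustration, not the presenters' actual implementation.

```python
# Hypothetical sketch: synchronous data-parallel training with
# Horovod + PyTorch. Model, data, and hyperparameters are assumptions,
# not the presenters' code.
import torch
import torch.nn as nn
import horovod.torch as hvd

hvd.init()                                   # one process per GPU
torch.cuda.set_device(hvd.local_rank())      # pin this rank to its GPU

# Toy regression model standing in for a real weather-prediction model.
model = nn.Sequential(nn.Linear(64, 128), nn.ReLU(), nn.Linear(128, 1)).cuda()

# Scale the learning rate with the worker count, a common heuristic.
optimizer = torch.optim.SGD(model.parameters(), lr=0.01 * hvd.size())

# Wrap the optimizer so gradients are allreduced across all GPUs.
optimizer = hvd.DistributedOptimizer(
    optimizer, named_parameters=model.named_parameters())

# Start every rank from identical weights.
hvd.broadcast_parameters(model.state_dict(), root_rank=0)

loss_fn = nn.MSELoss()
for step in range(100):
    x = torch.randn(256, 64, device="cuda")  # placeholder batch shard
    y = torch.randn(256, 1, device="cuda")
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()                           # grads allreduced on step()
    optimizer.step()
```

Launched as one process per GPU (e.g. `horovodrun -np 8 python train.py`), each rank trains on its own data shard while Horovod averages gradients across ranks, which is the usual way this kind of training scales out over the nodes of an HPC GPU cluster.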

Watch this session
Join in the conversation below.
