GTC 2020: Clara Developer Day: Federated Learning using Clara Train SDK

GTC 2020 S22564
Presenter: Nicola Rieke, NVIDIA
Abstract
Federated Learning techniques enable training robust AI models in a decentralized manner, meaning that models can learn from diverse data while that data never leaves the local site and always stays secure. This is achieved by sharing model weights, or partial model weights, from each local client and aggregating them on a server that never accesses the source data. In this session we will take a deep dive into the federated learning architecture of the latest Clara Train SDK. We will cover the core concepts of Federated Learning and the different collaborative learning techniques, and then look more closely at how the Clara Train SDK enables privacy-preserving Federated Learning. The session will also cover how easily Federated Learning clients can be brought up and communication established between the clients and a server for model aggregation.
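To make the aggregation step concrete, below is a minimal, generic sketch of server-side weighted averaging of client model weights (FedAvg-style). It is not the Clara Train SDK API; the names `aggregate_weights`, `client_updates`, and `num_samples` are hypothetical and only illustrate how a server can combine weights without ever seeing the clients' source data.

```python
# Hypothetical sketch of FedAvg-style aggregation; NOT the Clara Train SDK API.
import numpy as np

def aggregate_weights(client_updates):
    """Combine per-client model weights into a new global model.

    client_updates: list of (weights_dict, num_samples) tuples, where
    weights_dict maps layer names to numpy arrays and num_samples is the
    number of local training examples used as the averaging weight.
    """
    total_samples = sum(n for _, n in client_updates)
    layer_names = client_updates[0][0].keys()
    global_weights = {}
    for name in layer_names:
        # Weighted average of each layer across clients; the server only
        # ever handles weight tensors, never the clients' training data.
        global_weights[name] = sum(
            w[name] * (n / total_samples) for w, n in client_updates
        )
    return global_weights

# Example: two clients contributing a single-layer model.
client_a = ({"dense/kernel": np.ones((2, 2))}, 100)
client_b = ({"dense/kernel": np.zeros((2, 2))}, 300)
print(aggregate_weights([client_a, client_b]))  # 0.25 everywhere
```

Weighting by the number of local samples is one common choice; other aggregation schemes (e.g., equal weighting or partial-weight sharing) follow the same pattern of exchanging only model parameters.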

Watch this session