GTC 2020: Workstation Inference with TensorRT, cuDNN, and WinML

GTC 2020 CWE21166
Presenters:
Abstract
Our experts have extensive experience moving AI inference models from research to production environments, and they will share those experiences, tools, and techniques with you, covering topics such as:

  • Moving from research to production
  • Minimizing device memory usage
  • Performance optimization
  • Integration with existing code bases

Join us to learn more about the constraints of deploying AI inference models on Windows workstations with a local GPU.
