Video: Introduction to Recurrent Neural Networks in TensorRT

Originally published at:

NVIDIA TensorRT™ is a high-performance deep learning inference optimizer and runtime that delivers low latency and high throughput. TensorRT can import trained models from all major deep learning frameworks to easily create highly efficient inference engines that can be incorporated into larger applications and services. This video demonstrates how to configure a simple Recurrent Neural Network (RNN)…
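As background for the video, the recurrence an RNN layer computes can be sketched in plain NumPy. This is an illustrative sketch of a basic (Elman-style) RNN step, not TensorRT API code; all names and dimensions here are made up for the example:

```python
import numpy as np

def rnn_step(x_t, h_prev, W_x, W_h, b):
    # One recurrent step: h_t = tanh(x_t @ W_x + h_prev @ W_h + b)
    return np.tanh(x_t @ W_x + h_prev @ W_h + b)

# Toy dimensions: input size 3, hidden size 4, sequence length 5
rng = np.random.default_rng(0)
W_x = rng.standard_normal((3, 4)) * 0.1   # input-to-hidden weights
W_h = rng.standard_normal((4, 4)) * 0.1   # hidden-to-hidden weights
b = np.zeros(4)                           # bias

h = np.zeros(4)                           # initial hidden state
for x_t in rng.standard_normal((5, 3)):   # iterate over the sequence
    h = rnn_step(x_t, h, W_x, W_h, b)

print(h.shape)  # final hidden state has shape (4,)
```

An inference engine like TensorRT optimizes exactly this kind of per-timestep recurrence, fusing the matrix multiplies and activation into efficient GPU kernels.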

Is there any pricing for NVIDIA TensorRT Inference Server?