Deploying Diverse AI Model Categories from Public Model Zoo Using NVIDIA Triton Inference Server

Originally published at: https://developer.nvidia.com/blog/deploying-diverse-ai-model-categories-from-public-model-zoo-using-nvidia-triton-inference-server/

Nowadays, a huge number of implementations of state-of-the-art (SOTA) models and modeling solutions are available for different frameworks like TensorFlow, ONNX, PyTorch, Keras, MXNet, and so on. These models can be used for out-of-the-box inference if the categories you need are already covered by the training datasets, or they can be embedded into custom business scenarios with…
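Once such a model is placed in a Triton model repository, it can be queried from any client over HTTP or gRPC. Below is a minimal sketch using the Python `tritonclient` HTTP API; the model name `densenet_onnx` and the tensor names, shapes, and data types are placeholders for illustration and would need to match the actual model configuration in your repository.

```python
# Minimal sketch: send one inference request to a running Triton server.
# Assumes Triton is listening on localhost:8000 and serving a hypothetical
# model named "densenet_onnx"; tensor names/shapes below are placeholders.
import numpy as np
import tritonclient.http as httpclient

client = httpclient.InferenceServerClient(url="localhost:8000")

# Build a dummy input tensor matching the assumed model signature.
image = np.random.rand(1, 3, 224, 224).astype(np.float32)
infer_input = httpclient.InferInput("data_0", list(image.shape), "FP32")
infer_input.set_data_from_numpy(image)

# Request the assumed output tensor and run inference.
requested_output = httpclient.InferRequestedOutput("fc6_1")
response = client.infer(
    "densenet_onnx", inputs=[infer_input], outputs=[requested_output]
)

# The result comes back as a NumPy array keyed by output name.
print(response.as_numpy("fc6_1").shape)
```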

We hope you find this post a helpful starting point for AI model inference. If you have any questions or comments, let us know.

Great blog post! I wanted to get the example code, but the link in the post appears to be broken. How can I get the code?

Thank you for the feedback. The code should be accessible now; please check it again.