NVIDIA TensorRT Inference Server Now Open Source

Originally published at: https://developer.nvidia.com/blog/nvidia-tensorrt-inference-server-now-open-source/

In September 2018, NVIDIA introduced TensorRT Inference Server, a production-ready solution for data center inference deployments. TensorRT Inference Server maximizes GPU utilization, supports all popular AI frameworks, and eliminates the need to write inference stacks from scratch. You can learn more about TensorRT Inference Server in this NVIDIA Developer Blog post. Today we are announcing that NVIDIA TensorRT Inference Server is now an open source project.
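
To give a feel for how the server is used without writing an inference stack yourself, here is a minimal sketch of checking a running server's readiness and model status over its HTTP interface. The port (8000) and the endpoint paths are assumptions based on the server's documented v1 REST API; verify them against the release you deploy.

```python
# Minimal sketch: query a running TensorRT Inference Server over HTTP.
# Assumes the server listens on localhost:8000 (the documented default)
# and exposes the v1 REST endpoints /api/health/ready and /api/status --
# confirm these paths against the version you are running.
import requests

SERVER = "http://localhost:8000"

# Readiness check: HTTP 200 means the server can accept inference requests.
ready = requests.get(f"{SERVER}/api/health/ready", timeout=5)
print("server ready:", ready.status_code == 200)

# Status: reports which models the server has loaded and their state.
status = requests.get(f"{SERVER}/api/status", timeout=5)
print(status.text)
```

Because the server fronts models from any supported framework behind the same HTTP/gRPC endpoints, a client like this stays unchanged as models are added or swapped out.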